[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15627 1726882459.62747: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15627 1726882459.63214: Added group all to inventory
15627 1726882459.63216: Added group ungrouped to inventory
15627 1726882459.63221: Group all now contains ungrouped
15627 1726882459.63224: Examining possible inventory source: /tmp/network-91m/inventory.yml
15627 1726882459.88370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15627 1726882459.88433: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15627 1726882459.88459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15627 1726882459.88725: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15627 1726882459.88802: Loaded config def from plugin (inventory/script)
15627 1726882459.88804: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15627 1726882459.88845: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15627 1726882459.89007: Loaded config def from plugin (inventory/yaml)
15627 1726882459.89009: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15627 1726882459.89104: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15627 1726882459.89548: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15627 1726882459.89552: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15627 1726882459.89555: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15627 1726882459.89561: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15627 1726882459.89573: Loading data from /tmp/network-91m/inventory.yml
15627 1726882459.89645: /tmp/network-91m/inventory.yml was not parsable by auto
15627 1726882459.89719: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15627 1726882459.89758: Loading data from /tmp/network-91m/inventory.yml
15627 1726882459.89840: group all already in inventory
15627 1726882459.89847: set inventory_file for managed_node1
15627 1726882459.89851: set inventory_dir for managed_node1
15627 1726882459.89852: Added host managed_node1 to inventory
15627 1726882459.89854: Added host managed_node1 to group all
15627 1726882459.89855: set ansible_host for managed_node1
15627 1726882459.89856: set ansible_ssh_extra_args for managed_node1
15627 1726882459.89859: set inventory_file for managed_node2
15627 1726882459.89862: set inventory_dir for managed_node2
15627 1726882459.89865: Added host managed_node2 to inventory
15627 1726882459.89866: Added host managed_node2 to group all
15627 1726882459.89867: set ansible_host for managed_node2
15627 1726882459.89868: set ansible_ssh_extra_args for managed_node2
15627 1726882459.89871: set inventory_file for managed_node3
15627 1726882459.89873: set inventory_dir for managed_node3
15627 1726882459.89874: Added host managed_node3 to inventory
15627 1726882459.89875: Added host managed_node3 to group all
15627 1726882459.89876: set ansible_host for managed_node3
15627 1726882459.89876: set ansible_ssh_extra_args for managed_node3
15627 1726882459.89879: Reconcile groups and hosts in inventory.
15627 1726882459.89883: Group ungrouped now contains managed_node1
15627 1726882459.89885: Group ungrouped now contains managed_node2
15627 1726882459.89886: Group ungrouped now contains managed_node3
15627 1726882459.89975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15627 1726882459.90120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15627 1726882459.90174: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15627 1726882459.90204: Loaded config def from plugin (vars/host_group_vars)
15627 1726882459.90207: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15627 1726882459.90214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15627 1726882459.90231: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15627 1726882459.90280: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15627 1726882459.90677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882459.91146: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15627 1726882459.91192: Loaded config def from plugin (connection/local)
15627 1726882459.91195: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15627 1726882459.91813: Loaded config def from plugin (connection/paramiko_ssh)
15627 1726882459.91816: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15627 1726882459.92916: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15627 1726882459.92956: Loaded config def from plugin (connection/psrp)
15627 1726882459.92959: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15627 1726882459.93855: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15627 1726882459.93903: Loaded config def from plugin (connection/ssh)
15627 1726882459.93906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15627 1726882459.96550: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15627 1726882459.96593: Loaded config def from plugin (connection/winrm)
15627 1726882459.96596: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15627 1726882459.96627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15627 1726882459.96700: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15627 1726882459.96777: Loaded config def from plugin (shell/cmd)
15627 1726882459.96780: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15627 1726882459.96806: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15627 1726882459.96878: Loaded config def from plugin (shell/powershell)
15627 1726882459.96881: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15627 1726882459.96934: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15627 1726882459.97138: Loaded config def from plugin (shell/sh)
15627 1726882459.97141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15627 1726882459.97177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15627 1726882459.97682: Loaded config def from plugin (become/runas)
15627 1726882459.97685: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15627 1726882459.98112: Loaded config def from plugin (become/su)
15627 1726882459.98114: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15627 1726882459.98563: Loaded config def from plugin (become/sudo)
15627 1726882459.98567: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15627 1726882459.98602: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15627 1726882459.99331: in VariableManager get_vars()
15627 1726882459.99353: done with get_vars()
15627 1726882459.99502: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15627 1726882460.07625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15627 1726882460.08128: in VariableManager get_vars()
15627 1726882460.08137: done with get_vars()
15627 1726882460.08150: variable 'playbook_dir' from source: magic vars
15627 1726882460.08156: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.08158: variable 'ansible_config_file' from source: magic vars
15627 1726882460.08158: variable 'groups' from source: magic vars
15627 1726882460.08159: variable 'omit' from source: magic vars
15627 1726882460.08160: variable 'ansible_version' from source: magic vars
15627 1726882460.08161: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.08162: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.08162: variable 'ansible_forks' from source: magic vars
15627 1726882460.08164: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.08165: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.08166: variable 'ansible_limit' from source: magic vars
15627 1726882460.08167: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.08168: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.08217: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
15627 1726882460.09322: in VariableManager get_vars()
15627 1726882460.09340: done with get_vars()
15627 1726882460.09442: in VariableManager get_vars()
15627 1726882460.09458: done with get_vars()
15627 1726882460.09506: in VariableManager get_vars()
15627 1726882460.09519: done with get_vars()
15627 1726882460.09722: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15627 1726882460.10450: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15627 1726882460.11166: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15627 1726882460.11920: in VariableManager get_vars()
15627 1726882460.11948: done with get_vars()
15627 1726882460.12420: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15627 1726882460.12559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15627 1726882460.13805: in VariableManager get_vars()
15627 1726882460.13810: done with get_vars()
15627 1726882460.13813: variable 'playbook_dir' from source: magic vars
15627 1726882460.13814: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.13815: variable 'ansible_config_file' from source: magic vars
15627 1726882460.13816: variable 'groups' from source: magic vars
15627 1726882460.13817: variable 'omit' from source: magic vars
15627 1726882460.13818: variable 'ansible_version' from source: magic vars
15627 1726882460.13818: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.13819: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.13820: variable 'ansible_forks' from source: magic vars
15627 1726882460.13821: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.13822: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.13823: variable 'ansible_limit' from source: magic vars
15627 1726882460.13824: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.13825: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.13861: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15627 1726882460.13973: in VariableManager get_vars()
15627 1726882460.13986: done with get_vars()
15627 1726882460.14017: in VariableManager get_vars()
15627 1726882460.14021: done with get_vars()
15627 1726882460.14023: variable 'playbook_dir' from source: magic vars
15627 1726882460.14023: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.14024: variable 'ansible_config_file' from source: magic vars
15627 1726882460.14025: variable 'groups' from source: magic vars
15627 1726882460.14026: variable 'omit' from source: magic vars
15627 1726882460.14026: variable 'ansible_version' from source: magic vars
15627 1726882460.14027: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.14028: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.14029: variable 'ansible_forks' from source: magic vars
15627 1726882460.14029: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.14030: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.14031: variable 'ansible_limit' from source: magic vars
15627 1726882460.14031: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.14032: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.14061: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15627 1726882460.14122: in VariableManager get_vars()
15627 1726882460.14135: done with get_vars()
15627 1726882460.14184: in VariableManager get_vars()
15627 1726882460.14187: done with get_vars()
15627 1726882460.14189: variable 'playbook_dir' from source: magic vars
15627 1726882460.14190: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.14191: variable 'ansible_config_file' from source: magic vars
15627 1726882460.14192: variable 'groups' from source: magic vars
15627 1726882460.14193: variable 'omit' from source: magic vars
15627 1726882460.14194: variable 'ansible_version' from source: magic vars
15627 1726882460.14194: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.14195: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.14196: variable 'ansible_forks' from source: magic vars
15627 1726882460.14202: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.14203: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.14204: variable 'ansible_limit' from source: magic vars
15627 1726882460.14204: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.14205: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.14237: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
15627 1726882460.14309: in VariableManager get_vars()
15627 1726882460.14313: done with get_vars()
15627 1726882460.14315: variable 'playbook_dir' from source: magic vars
15627 1726882460.14316: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.14317: variable 'ansible_config_file' from source: magic vars
15627 1726882460.14318: variable 'groups' from source: magic vars
15627 1726882460.14318: variable 'omit' from source: magic vars
15627 1726882460.14319: variable 'ansible_version' from source: magic vars
15627 1726882460.14320: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.14321: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.14321: variable 'ansible_forks' from source: magic vars
15627 1726882460.14322: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.14323: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.14324: variable 'ansible_limit' from source: magic vars
15627 1726882460.14325: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.14325: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.14355: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
15627 1726882460.14423: in VariableManager get_vars()
15627 1726882460.14435: done with get_vars()
15627 1726882460.14482: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15627 1726882460.14599: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15627 1726882460.14787: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15627 1726882460.15191: in VariableManager get_vars()
15627 1726882460.15212: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15627 1726882460.16727: in VariableManager get_vars()
15627 1726882460.16741: done with get_vars()
15627 1726882460.16781: in VariableManager get_vars()
15627 1726882460.16785: done with get_vars()
15627 1726882460.16787: variable 'playbook_dir' from source: magic vars
15627 1726882460.16788: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.16789: variable 'ansible_config_file' from source: magic vars
15627 1726882460.16789: variable 'groups' from source: magic vars
15627 1726882460.16790: variable 'omit' from source: magic vars
15627 1726882460.16791: variable 'ansible_version' from source: magic vars
15627 1726882460.16792: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.16792: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.16793: variable 'ansible_forks' from source: magic vars
15627 1726882460.16794: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.16795: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.16795: variable 'ansible_limit' from source: magic vars
15627 1726882460.16796: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.16797: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.16829: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
15627 1726882460.16902: in VariableManager get_vars()
15627 1726882460.16914: done with get_vars()
15627 1726882460.16957: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15627 1726882460.17093: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15627 1726882460.17172: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15627 1726882460.17549: in VariableManager get_vars()
15627 1726882460.17705: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15627 1726882460.19222: in VariableManager get_vars()
15627 1726882460.19225: done with get_vars()
15627 1726882460.19228: variable 'playbook_dir' from source: magic vars
15627 1726882460.19229: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.19230: variable 'ansible_config_file' from source: magic vars
15627 1726882460.19230: variable 'groups' from source: magic vars
15627 1726882460.19231: variable 'omit' from source: magic vars
15627 1726882460.19232: variable 'ansible_version' from source: magic vars
15627 1726882460.19233: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.19234: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.19234: variable 'ansible_forks' from source: magic vars
15627 1726882460.19235: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.19236: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.19236: variable 'ansible_limit' from source: magic vars
15627 1726882460.19237: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.19238: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.19270: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15627 1726882460.19336: in VariableManager get_vars()
15627 1726882460.19349: done with get_vars()
15627 1726882460.19455: in VariableManager get_vars()
15627 1726882460.19459: done with get_vars()
15627 1726882460.19461: variable 'playbook_dir' from source: magic vars
15627 1726882460.19462: variable 'ansible_playbook_python' from source: magic vars
15627 1726882460.19463: variable 'ansible_config_file' from source: magic vars
15627 1726882460.19466: variable 'groups' from source: magic vars
15627 1726882460.19467: variable 'omit' from source: magic vars
15627 1726882460.19468: variable 'ansible_version' from source: magic vars
15627 1726882460.19468: variable 'ansible_check_mode' from source: magic vars
15627 1726882460.19469: variable 'ansible_diff_mode' from source: magic vars
15627 1726882460.19470: variable 'ansible_forks' from source: magic vars
15627 1726882460.19471: variable 'ansible_inventory_sources' from source: magic vars
15627 1726882460.19471: variable 'ansible_skip_tags' from source: magic vars
15627 1726882460.19472: variable 'ansible_limit' from source: magic vars
15627 1726882460.19473: variable 'ansible_run_tags' from source: magic vars
15627 1726882460.19474: variable 'ansible_verbosity' from source: magic vars
15627 1726882460.19503: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15627 1726882460.19703: in VariableManager get_vars()
15627 1726882460.19717: done with get_vars()
15627 1726882460.19785: in VariableManager get_vars()
15627 1726882460.19798: done with get_vars()
15627 1726882460.19879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15627 1726882460.19893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15627 1726882460.20128: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15627 1726882460.22595: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15627 1726882460.22603: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
15627 1726882460.22893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15627 1726882460.22921: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15627 1726882460.23121: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15627 1726882460.23191: Loaded config def from plugin (callback/default)
15627 1726882460.23193: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15627 1726882460.24488: Loaded config def from plugin (callback/junit)
15627 1726882460.24491: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15627 1726882460.24570: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15627 1726882460.24665: Loaded config def from plugin (callback/minimal)
15627 1726882460.24669: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15627 1726882460.24708: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
15627 1726882460.24792: Loaded config def from plugin (callback/tree)
15627 1726882460.24795: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15627 1726882460.24921: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15627 1726882460.24924: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15627 1726882460.24956: in VariableManager get_vars()
15627 1726882460.24978: done with get_vars()
15627 1726882460.24984: in VariableManager get_vars()
15627 1726882460.24993: done with get_vars()
15627 1726882460.24997: variable 'omit' from source: magic vars
15627 1726882460.25033: in VariableManager get_vars()
15627 1726882460.25048: done with get_vars()
15627 1726882460.25081: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15627 1726882460.25677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15627 1726882460.25761: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15627 1726882460.25793: getting the remaining hosts for this loop
15627 1726882460.25795: done getting the remaining hosts for this loop
15627 1726882460.25798: getting the next task for host managed_node1
15627 1726882460.25801: done getting next task for host managed_node1
15627 1726882460.25803: ^ task is: TASK: Gathering Facts
15627 1726882460.25804: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882460.25807: getting variables
15627 1726882460.25808: in VariableManager get_vars()
15627 1726882460.25817: Calling all_inventory to load vars for managed_node1
15627 1726882460.25820: Calling groups_inventory to load vars for managed_node1
15627 1726882460.25822: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882460.25833: Calling all_plugins_play to load vars for managed_node1
15627 1726882460.25847: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882460.25851: Calling groups_plugins_play to load vars for managed_node1
15627 1726882460.25891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882460.25942: done with get_vars()
15627 1726882460.25949: done getting variables
15627 1726882460.26015: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024  21:34:20 -0400 (0:00:00.011)       0:00:00.012 ******
15627 1726882460.26036: entering _queue_task() for managed_node1/gather_facts
15627 1726882460.26037: Creating lock for gather_facts
15627 1726882460.26399: worker is 1 (out of 1 available)
15627 1726882460.26411: exiting _queue_task() for managed_node1/gather_facts
15627 1726882460.26423: done queuing things up, now waiting for results queue to drain
15627 1726882460.26425: waiting for pending results...
15627 1726882460.26694: running TaskExecutor() for managed_node1/TASK: Gathering Facts
15627 1726882460.26813: in run() - task 0e448fcc-3ce9-2847-7723-00000000007e
15627 1726882460.26838: variable 'ansible_search_path' from source: unknown
15627 1726882460.26885: calling self._execute()
15627 1726882460.26949: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882460.26965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882460.26982: variable 'omit' from source: magic vars
15627 1726882460.27094: variable 'omit' from source: magic vars
15627 1726882460.27124: variable 'omit' from source: magic vars
15627 1726882460.27175: variable 'omit' from source: magic vars
15627 1726882460.27226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15627 1726882460.27276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15627 1726882460.27303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15627 1726882460.27326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882460.27343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882460.27393: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15627 1726882460.27402: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882460.27410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882460.27525: Set connection var ansible_timeout to 10
15627 1726882460.27542: Set connection var ansible_shell_executable to /bin/sh
15627 1726882460.27553: Set connection var ansible_connection to ssh
15627 1726882460.27569: Set connection var ansible_module_compression to ZIP_DEFLATED
15627 1726882460.27580: Set connection var ansible_pipelining to False
15627 1726882460.27587: Set connection var ansible_shell_type to sh
15627 1726882460.27659: variable 'ansible_shell_executable' from source: unknown
15627 1726882460.27671: variable 'ansible_connection' from source: unknown
15627 1726882460.27680: variable 'ansible_module_compression' from source: unknown
15627 1726882460.27688: variable 'ansible_shell_type' from source: unknown
15627 1726882460.27695: variable 'ansible_shell_executable' from source: unknown
15627 1726882460.27704: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882460.27718: variable 'ansible_pipelining' from source: unknown
15627 1726882460.27726: variable 'ansible_timeout' from source: unknown
15627 1726882460.27734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882460.27938: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15627 1726882460.27953: variable 'omit' from source: magic vars
15627 1726882460.27970: starting attempt loop
15627 1726882460.27979: running the handler
15627 1726882460.27999: variable 'ansible_facts' from source: unknown
15627 1726882460.28022: _low_level_execute_command(): starting
15627 1726882460.28041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15627 1726882460.28877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15627 1726882460.28893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882460.28909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15627 1726882460.28934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882460.28987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
15627 1726882460.29001: stderr chunk (state=3): >>>debug2: match not found <<<
15627 1726882460.29016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882460.29037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
15627 1726882460.29060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
15627 1726882460.29078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
15627 1726882460.29092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882460.29107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15627 1726882460.29124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882460.29137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
15627 1726882460.29157: stderr chunk (state=3): >>>debug2: match found <<<
15627 1726882460.29178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882460.29259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15627 1726882460.29295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15627 1726882460.29312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15627 1726882460.29445: stderr chunk (state=3): >>>debug1:
mux_client_request_session: master session id: 2 <<< 15627 1726882460.31113: stdout chunk (state=3): >>>/root <<< 15627 1726882460.31315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.31319: stdout chunk (state=3): >>><<< 15627 1726882460.31321: stderr chunk (state=3): >>><<< 15627 1726882460.31444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882460.31448: _low_level_execute_command(): starting 15627 1726882460.31451: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363 `" && echo ansible-tmp-1726882460.3134267-15670-28312173966363="` echo /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363 `" ) && sleep 0' 15627 
1726882460.32739: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.32753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.32769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.32879: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882460.32889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.32903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882460.32910: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882460.32917: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882460.32925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.32934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.32946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.32953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.32959: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882460.32970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.33041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882460.33683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.33694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882460.33814: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.35715: stdout chunk (state=3): >>>ansible-tmp-1726882460.3134267-15670-28312173966363=/root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363 <<< 15627 1726882460.35892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.35895: stdout chunk (state=3): >>><<< 15627 1726882460.35902: stderr chunk (state=3): >>><<< 15627 1726882460.36369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882460.3134267-15670-28312173966363=/root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882460.36372: variable 'ansible_module_compression' from source: unknown 15627 1726882460.36375: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15627 1726882460.36377: ANSIBALLZ: 
Acquiring lock 15627 1726882460.36379: ANSIBALLZ: Lock acquired: 140251854220672 15627 1726882460.36381: ANSIBALLZ: Creating module 15627 1726882460.71450: ANSIBALLZ: Writing module into payload 15627 1726882460.71622: ANSIBALLZ: Writing module 15627 1726882460.71662: ANSIBALLZ: Renaming module 15627 1726882460.71676: ANSIBALLZ: Done creating module 15627 1726882460.71716: variable 'ansible_facts' from source: unknown 15627 1726882460.71729: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882460.71745: _low_level_execute_command(): starting 15627 1726882460.71758: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15627 1726882460.72428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882460.72441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.72458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.72481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.72523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.72535: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882460.72548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.72574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882460.72587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 
1726882460.72597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882460.72609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.72621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.72635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.72646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.72659: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882460.72676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.72752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882460.72776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.72792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882460.73145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.74597: stdout chunk (state=3): >>>PLATFORM <<< 15627 1726882460.74703: stdout chunk (state=3): >>>Linux FOUND <<< 15627 1726882460.74706: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15627 1726882460.74936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.74939: stdout chunk (state=3): >>><<< 15627 1726882460.74942: stderr chunk (state=3): >>><<< 15627 1726882460.75089: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882460.75100 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 15627 1726882460.75103: _low_level_execute_command(): starting 15627 1726882460.75106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 15627 1726882460.75169: Sending initial data 15627 1726882460.75172: Sent initial data (1181 bytes) 15627 1726882460.75718: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882460.75733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.75754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.75774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.75817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.75828: stderr chunk (state=3): >>>debug2: match not found <<< 
15627 1726882460.75844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.75869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882460.75882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882460.75892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882460.75903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.75916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.75930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.75941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.75951: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882460.75975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.76050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882460.76082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.76098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882460.76218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.80056: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat 
Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15627 1726882460.80443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.80546: stderr chunk (state=3): >>><<< 15627 1726882460.80559: stdout chunk (state=3): >>><<< 15627 1726882460.80874: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 
1726882460.80878: variable 'ansible_facts' from source: unknown 15627 1726882460.80880: variable 'ansible_facts' from source: unknown 15627 1726882460.80882: variable 'ansible_module_compression' from source: unknown 15627 1726882460.80885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882460.80886: variable 'ansible_facts' from source: unknown 15627 1726882460.80940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/AnsiballZ_setup.py 15627 1726882460.81119: Sending initial data 15627 1726882460.81122: Sent initial data (153 bytes) 15627 1726882460.82143: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882460.82157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.82177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.82203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.82244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.82255: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882460.82274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.82299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882460.82311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882460.82321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882460.82332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.82345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15627 1726882460.82360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.82374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.82385: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882460.82398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.82496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882460.82526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.82542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882460.82669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.84465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882460.84553: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882460.84648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpp5nay497 /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/AnsiballZ_setup.py <<< 15627 1726882460.84740: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or 
directory <<< 15627 1726882460.87286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.87470: stderr chunk (state=3): >>><<< 15627 1726882460.87473: stdout chunk (state=3): >>><<< 15627 1726882460.87598: done transferring module to remote 15627 1726882460.87602: _low_level_execute_command(): starting 15627 1726882460.87605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/ /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/AnsiballZ_setup.py && sleep 0' 15627 1726882460.88229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882460.88246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.88269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.88311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.88356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.88378: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882460.88393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.88412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882460.88425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882460.88437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882460.88449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882460.88466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.88489: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.88502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882460.88514: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882460.88528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.88613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882460.88635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.88652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882460.88778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.90670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882460.90703: stderr chunk (state=3): >>><<< 15627 1726882460.90714: stdout chunk (state=3): >>><<< 15627 1726882460.90821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882460.90825: _low_level_execute_command(): starting 15627 1726882460.90828: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/AnsiballZ_setup.py && sleep 0' 15627 1726882460.92373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.92377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882460.92410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.92413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882460.92416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882460.92418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882460.92486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882460.93102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 15627 1726882460.93202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882460.95227: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15627 1726882460.95301: stdout chunk (state=3): >>>import '_io' # <<< 15627 1726882460.95304: stdout chunk (state=3): >>>import 'marshal' # <<< 15627 1726882460.95325: stdout chunk (state=3): >>>import 'posix' # <<< 15627 1726882460.95362: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15627 1726882460.95412: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 15627 1726882460.95415: stdout chunk (state=3): >>># installed zipimport hook <<< 15627 1726882460.95470: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882460.95503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 15627 1726882460.95519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 15627 1726882460.95533: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019b3dc0> <<< 15627 1726882460.95572: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 15627 1726882460.95600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f67019b3b20> <<< 15627 1726882460.95632: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 15627 1726882460.95636: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019b3ac0> <<< 15627 1726882460.95671: stdout chunk (state=3): >>>import '_signal' # <<< 15627 1726882460.95701: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 15627 1726882460.95731: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958490> <<< 15627 1726882460.95764: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 15627 1726882460.95769: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958940> <<< 15627 1726882460.95787: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958670> <<< 15627 1726882460.95822: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 15627 1726882460.95844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 15627 1726882460.95847: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches 
/usr/lib64/python3.9/os.py <<< 15627 1726882460.95882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 15627 1726882460.95885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 15627 1726882460.95922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 15627 1726882460.95941: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f190> <<< 15627 1726882460.95957: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 15627 1726882460.95961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 15627 1726882460.96049: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f220> <<< 15627 1726882460.96080: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 15627 1726882460.96104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f940> <<< 15627 1726882460.96128: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701970880> <<< 15627 1726882460.96153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches 
/usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701908d90> <<< 15627 1726882460.96219: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701932d90> <<< 15627 1726882460.96277: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958970> <<< 15627 1726882460.96291: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15627 1726882460.96630: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 15627 1726882460.96648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 15627 1726882460.96689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 15627 1726882460.96693: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 15627 1726882460.96729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 15627 1726882460.96741: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f67018d3eb0> <<< 15627 1726882460.96784: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d6f40> <<< 15627 1726882460.96827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 15627 1726882460.96830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 15627 1726882460.96856: stdout chunk (state=3): >>>import '_sre' # <<< 15627 1726882460.96859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 15627 1726882460.96883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 15627 1726882460.96894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 15627 1726882460.96931: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d2640> <<< 15627 1726882460.96941: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d3370> <<< 15627 1726882460.96958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 15627 1726882460.97032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 15627 1726882460.97053: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 15627 1726882460.97080: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882460.97104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 15627 1726882460.97170: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701591dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015918b0> <<< 15627 1726882460.97175: stdout chunk (state=3): >>>import 'itertools' # <<< 15627 1726882460.97197: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591eb0> <<< 15627 1726882460.97202: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 15627 1726882460.97225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 15627 1726882460.97251: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591f70> <<< 15627 1726882460.97288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591e80> <<< 15627 1726882460.97291: stdout chunk (state=3): >>>import 
'_collections' # <<< 15627 1726882460.97344: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018aed30> <<< 15627 1726882460.97347: stdout chunk (state=3): >>>import '_functools' # <<< 15627 1726882460.97370: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018a7610> <<< 15627 1726882460.97445: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 15627 1726882460.97460: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018dae20> <<< 15627 1726882460.97462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 15627 1726882460.97495: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67015a3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018ae250> <<< 15627 1726882460.97531: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882460.97565: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67018bb280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018e09d0> <<< 15627 1726882460.97568: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 15627 1726882460.97605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 15627 1726882460.97613: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882460.97640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 15627 1726882460.97643: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3d90> <<< 15627 1726882460.97689: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 15627 1726882460.97702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3d00> <<< 15627 1726882460.97705: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 15627 1726882460.97734: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 15627 1726882460.97758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 15627 1726882460.97809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 15627 1726882460.97856: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701576370> <<< 15627 1726882460.97874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 15627 1726882460.97877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 15627 1726882460.97901: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701576460> <<< 15627 1726882460.98027: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015abfa0> <<< 15627 1726882460.98081: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5a30> <<< 15627 1726882460.98106: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5490> <<< 15627 1726882460.98109: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 15627 1726882460.98143: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 15627 1726882460.98171: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 15627 1726882460.98185: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014aa1c0> <<< 15627 1726882460.98212: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701561c70> <<< 15627 1726882460.98281: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018e0040> <<< 15627 1726882460.98300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 15627 1726882460.98337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 15627 1726882460.98353: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014bcaf0> import 'errno' # <<< 15627 1726882460.98404: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014bce20> <<< 15627 1726882460.98434: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from 
'/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 15627 1726882460.98455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014ce730> <<< 15627 1726882460.98472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 15627 1726882460.98506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 15627 1726882460.98525: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014cec70> <<< 15627 1726882460.98566: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014663a0> <<< 15627 1726882460.98593: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014bcf10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 15627 1726882460.98611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 15627 1726882460.98652: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701477280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f67014ce5b0> <<< 15627 1726882460.98681: stdout chunk (state=3): >>>import 'pwd' # <<< 15627 1726882460.98697: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701477340> <<< 15627 1726882460.98738: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a39d0> <<< 15627 1726882460.98758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 15627 1726882460.98778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 15627 1726882460.98802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 15627 1726882460.98846: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014926a0> <<< 15627 1726882460.98876: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 15627 1726882460.98901: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import 
'_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701492760> <<< 15627 1726882460.98915: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492850> <<< 15627 1726882460.98944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 15627 1726882460.99141: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492ca0> <<< 15627 1726882460.99182: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f670149f1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014928e0> <<< 15627 1726882460.99205: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701486a30> <<< 15627 1726882460.99222: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a35b0> <<< 15627 1726882460.99233: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 15627 1726882460.99296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 15627 1726882460.99327: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701492a90> <<< 15627 1726882460.99497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 15627 1726882460.99500: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f67013c1670> <<< 15627 1726882460.99782: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15627 1726882460.99885: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882460.99923: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 15627 1726882460.99949: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882460.99952: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 15627 1726882461.01188: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.02188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d047f0> <<< 15627 1726882461.02225: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d95760> <<< 15627 1726882461.02265: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95640> <<< 15627 1726882461.02289: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95370> <<< 15627 1726882461.02309: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 15627 1726882461.02368: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95190> <<< 15627 1726882461.02372: stdout chunk (state=3): >>>import 'atexit' # <<< 15627 1726882461.02414: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d95400> <<< 15627 1726882461.02432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 15627 1726882461.02447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 15627 1726882461.02494: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d957c0> <<< 15627 1726882461.02515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 15627 1726882461.02536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 15627 1726882461.02567: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 15627 1726882461.02587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 15627 1726882461.02590: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 15627 1726882461.02660: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d6e7c0> <<< 15627 1726882461.02709: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d6eb50> <<< 15627 1726882461.02739: 
stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d6e9a0> <<< 15627 1726882461.02756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 15627 1726882461.02760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 15627 1726882461.02795: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700c874f0> <<< 15627 1726882461.02813: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d8ed30> <<< 15627 1726882461.02976: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95520> <<< 15627 1726882461.03028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 15627 1726882461.03059: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d8e190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 15627 1726882461.03091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 15627 1726882461.03136: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py <<< 15627 1726882461.03141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700dbfa90> <<< 15627 1726882461.03216: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d62190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d62790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700c8dd00> <<< 15627 1726882461.03264: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d626a0> <<< 15627 1726882461.03298: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700de3d30> <<< 15627 1726882461.03314: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 15627 1726882461.03325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 15627 1726882461.03368: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 15627 1726882461.03428: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700ce59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700deee50> <<< 15627 1726882461.03473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 15627 1726882461.03476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 15627 1726882461.03532: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf50d0> <<< 15627 1726882461.03536: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700deee20> <<< 15627 1726882461.03547: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 15627 1726882461.03605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882461.03619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 15627 1726882461.03677: stdout chunk (state=3): >>>import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700df5220> <<< 15627 1726882461.03801: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf5100> <<< 15627 1726882461.03898: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700db9b80> <<< 15627 1726882461.03931: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700deeac0> <<< 15627 1726882461.03981: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700deed00> <<< 15627 1726882461.03995: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67013c1820> <<< 15627 1726882461.04022: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 15627 1726882461.04034: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 15627 1726882461.04051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 15627 1726882461.04084: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf10d0> <<< 15627 1726882461.04301: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700ce7370> <<< 15627 1726882461.04305: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf1d00> <<< 15627 1726882461.04337: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf2130> # zipimport: zlib available <<< 15627 1726882461.04364: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 15627 1726882461.04367: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15627 1726882461.04438: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.04535: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15627 1726882461.04539: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 15627 1726882461.04574: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 15627 1726882461.04577: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.04676: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.04768: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.05239: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.05724: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 15627 1726882461.05728: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 15627 1726882461.05746: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882461.05812: stdout chunk 
(state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d2d8b0> <<< 15627 1726882461.05887: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 15627 1726882461.05906: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d33910> <<< 15627 1726882461.05909: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a56a0> <<< 15627 1726882461.05949: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 15627 1726882461.05952: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.05986: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.05999: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 15627 1726882461.06116: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.06269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 15627 1726882461.06283: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6700d6c7f0> # zipimport: zlib available <<< 15627 1726882461.06670: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07033: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07089: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07165: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 15627 1726882461.07169: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07239: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 15627 1726882461.07243: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07391: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 15627 1726882461.07395: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 15627 1726882461.07412: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07434: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07479: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 15627 1726882461.07482: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.07852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 15627 1726882461.07890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 15627 1726882461.07893: stdout chunk (state=3): >>>import '_ast' # <<< 15627 1726882461.07970: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a9d90> # zipimport: zlib available <<< 15627 1726882461.08029: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08116: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 15627 1726882461.08122: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 15627 1726882461.08138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08168: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882461.08210: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 15627 1726882461.08217: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08247: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08279: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08375: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08429: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 15627 1726882461.08468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882461.08533: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d200a0> <<< 15627 1726882461.08632: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086e070> <<< 15627 1726882461.08686: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 15627 1726882461.08690: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 15627 1726882461.08728: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08787: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08812: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.08851: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 15627 1726882461.08869: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 15627 1726882461.08888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 15627 1726882461.08911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 15627 1726882461.08935: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 15627 1726882461.08954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 15627 1726882461.09034: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d29160> <<< 15627 1726882461.09074: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d25cd0> <<< 15627 1726882461.09143: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 15627 1726882461.09147: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09191: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882461.09195: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 15627 1726882461.09306: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available <<< 15627 1726882461.09309: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 15627 1726882461.09367: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09421: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09453: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09459: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09493: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09523: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09553: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09586: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 15627 1726882461.09603: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09747: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15627 1726882461.09750: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.09782: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 15627 1726882461.09934: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.10072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.10107: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.10146: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882461.10196: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 15627 1726882461.10213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 15627 1726882461.10216: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 15627 1726882461.10242: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700624a60> <<< 15627 1726882461.10276: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 15627 1726882461.10280: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 15627 1726882461.10292: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 15627 1726882461.10345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 15627 1726882461.10349: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 15627 1726882461.10369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008836d0> <<< 15627 1726882461.10402: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700883af0> <<< 15627 1726882461.10479: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086a250> <<< 15627 1726882461.10483: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086aa30> <<< 15627 1726882461.10523: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9460> <<< 15627 1726882461.10526: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9910> <<< 15627 1726882461.10546: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 15627 1726882461.10579: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 15627 1726882461.10633: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67008b5d00> <<< 15627 1726882461.10644: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b5d60> <<< 15627 1726882461.10666: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 15627 1726882461.10690: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b5250> <<< 15627 1726882461.10706: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 15627 1726882461.10720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 15627 1726882461.10747: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f670068cf70> <<< 15627 1726882461.10778: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008c6b50> <<< 15627 1726882461.10815: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 15627 1726882461.10827: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.10863: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 15627 1726882461.10918: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.10978: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 15627 1726882461.10982: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11018: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11075: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 15627 1726882461.11105: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 15627 1726882461.11108: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11129: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11157: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 15627 1726882461.11205: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11256: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 15627 1726882461.11260: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11288: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11334: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 15627 1726882461.11392: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11438: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11484: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11535: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 15627 1726882461.11552: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.11944: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12314: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 15627 1726882461.12357: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12405: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12435: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12470: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 15627 1726882461.12474: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12531: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 15627 1726882461.12536: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12583: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12635: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 15627 1726882461.12673: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12692: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 15627 1726882461.12710: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12724: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12759: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 15627 1726882461.12766: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12827: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.12903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 15627 1726882461.12930: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67005a4ca0> <<< 15627 1726882461.12945: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 15627 1726882461.12973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 15627 1726882461.13132: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67005a4fd0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 15627 1726882461.13141: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13191: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13253: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 15627 1726882461.13258: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13333: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13414: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 15627 1726882461.13472: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13542: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 15627 1726882461.13582: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.13622: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 15627 1726882461.13646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 15627 1726882461.13795: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700598370> <<< 15627 1726882461.14069: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67005e7bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 15627 1726882461.14073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14099: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14147: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 15627 1726882461.14153: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14224: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14294: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14387: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14528: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 15627 1726882461.14534: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14563: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14606: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 15627 1726882461.14611: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14646: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 15627 1726882461.14741: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882461.14760: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f670051e160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051e2b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 15627 1726882461.14782: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 15627 1726882461.14788: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14827: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.14868: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 15627 1726882461.14876: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15002: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15132: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 15627 1726882461.15138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15216: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15297: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15331: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15368: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 15627 1726882461.15380: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 15627 1726882461.15476: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15489: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15607: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15726: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 15627 1726882461.15738: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15840: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15945: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 15627 1726882461.15952: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.15979: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.16009: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.16451: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.16863: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 15627 1726882461.16875: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.16959: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17049: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 15627 1726882461.17133: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17217: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 15627 1726882461.17226: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17345: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17478: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 15627 1726882461.17505: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 15627 1726882461.17512: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17595: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 15627 1726882461.17679: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17759: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.17928: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18095: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 15627 1726882461.18105: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18137: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18177: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 15627 1726882461.18197: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18224: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 15627 1726882461.18231: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18344: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 15627 1726882461.18360: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18381: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18403: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 15627 1726882461.18411: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18459: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18509: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 15627 1726882461.18516: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18616: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 15627 1726882461.18622: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.18838: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19055: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 15627 1726882461.19062: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19107: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19161: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 15627 1726882461.19170: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19197: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19228: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 15627 1726882461.19235: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882461.19260: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19300: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 15627 1726882461.19306: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19367: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 15627 1726882461.19373: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19431: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19507: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 15627 1726882461.19530: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19541: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 15627 1726882461.19548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19582: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19629: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 15627 1726882461.19636: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19668: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19709: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19752: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19811: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19878: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 15627 1726882461.19892: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 15627 1726882461.19931: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.19982: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 15627 1726882461.19989: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20144: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20304: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 15627 
1726882461.20311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20346: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20394: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 15627 1726882461.20439: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20489: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 15627 1726882461.20552: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20622: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 15627 1726882461.20637: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20707: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.20792: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import 
ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 15627 1726882461.20864: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882461.21575: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 15627 1726882461.21597: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 15627 1726882461.21603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 15627 1726882461.21643: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67005728b0> <<< 15627 1726882461.21653: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700572310> <<< 15627 1726882461.21710: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700388850> <<< 15627 1726882461.23122: stdout chunk (state=3): >>>import 'gc' # <<< 15627 1726882461.25393: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700572370> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051eeb0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700383310> <<< 15627 1726882461.25406: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051a700> <<< 15627 1726882461.25619: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 15627 1726882461.25627: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15627 1726882461.51213: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": 
"#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "L<<< 15627 1726882461.51235: stdout chunk (state=3): >>>OGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version"<<< 15627 1726882461.51279: stdout chunk (state=3): >>>: "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, 
"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241455104, "block_size": 4096, "block_total": 65519355, "block_available": 64512074, "block_used": 1007281, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "21", "epoch": "1726882461", "epoch_int": "1726882461", "date": "2024-09-20", "time": "21:34:21", "iso8601_micro": "2024-09-21T01:34:21.456583Z", "iso8601": "2024-09-21T01:34:21Z", "iso8601_basic": "20240920T213421456583", "iso8601_basic_short": "20240920T213421", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": 
"02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882461.52170: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2<<< 15627 1726882461.52179: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 15627 1726882461.52187: stdout chunk (state=3): >>># clear sys.meta_path <<< 15627 1726882461.52190: stdout chunk (state=3): >>># clear sys.__interactivehook__ # restore sys.stdin <<< 15627 1726882461.52196: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 15627 1726882461.52199: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 15627 1726882461.52205: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref<<< 15627 1726882461.52331: stdout chunk (state=3): >>> # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 15627 1726882461.52337: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 15627 1726882461.52343: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8<<< 15627 1726882461.52349: 
stdout chunk (state=3): >>> # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc <<< 15627 1726882461.52379: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io<<< 15627 1726882461.52409: stdout chunk (state=3): >>> # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 15627 1726882461.52450: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath <<< 15627 1726882461.52487: stdout chunk (state=3): >>># cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale <<< 15627 1726882461.52529: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site<<< 15627 1726882461.52556: stdout chunk (state=3): >>> # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre <<< 15627 1726882461.52589: stdout chunk (state=3): >>># cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile<<< 15627 1726882461.52633: stdout chunk (state=3): >>> # cleanup[2] removing _heapq # cleanup[2] removing heapq<<< 15627 1726882461.52650: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 15627 1726882461.52698: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib<<< 15627 1726882461.52730: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections<<< 15627 1726882461.52755: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 15627 1726882461.53215: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct <<< 15627 
1726882461.53236: stdout chunk (state=3): >>># cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__<<< 15627 1726882461.53256: stdout chunk (state=3): >>> # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder<<< 15627 1726882461.53275: stdout chunk (state=3): >>> # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl <<< 15627 1726882461.53327: stdout chunk (state=3): 
>>># cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex<<< 15627 1726882461.53343: stdout chunk (state=3): >>> # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon<<< 15627 1726882461.53368: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text<<< 15627 1726882461.53380: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves<<< 15627 1726882461.53394: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 15627 1726882461.53405: stdout chunk (state=3): >>># destroy ctypes # 
cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings<<< 15627 1726882461.53421: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 15627 1726882461.53438: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 15627 1726882461.53452: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file<<< 15627 1726882461.53473: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext<<< 15627 1726882461.53483: stdout chunk (state=3): >>> # 
cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules<<< 15627 1726882461.53500: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction<<< 15627 1726882461.53513: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection <<< 15627 1726882461.53532: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot <<< 15627 1726882461.53562: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local <<< 15627 1726882461.53600: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd<<< 15627 1726882461.53650: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts<<< 15627 1726882461.53677: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb<<< 15627 1726882461.53703: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # 
destroy ansible.module_utils.facts.hardware.hpux<<< 15627 1726882461.53713: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly<<< 15627 1726882461.53731: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd <<< 15627 1726882461.53739: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly<<< 15627 1726882461.53756: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc<<< 15627 1726882461.54068: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15627 1726882461.54269: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15627 1726882461.54312: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 15627 1726882461.54361: stdout chunk (state=3): >>># destroy zipimport <<< 15627 1726882461.54389: stdout chunk (state=3): >>># destroy _compression <<< 15627 1726882461.54427: stdout chunk (state=3): >>># destroy binascii # destroy importlib <<< 15627 1726882461.54439: stdout chunk (state=3): >>># destroy bz2 # destroy lzma <<< 15627 1726882461.54549: stdout chunk (state=3): >>># destroy __main__ # destroy locale<<< 15627 1726882461.54552: stdout chunk (state=3): >>> # destroy systemd.journal <<< 15627 1726882461.54557: stdout chunk (state=3): >>># destroy systemd.daemon # destroy hashlib<<< 15627 1726882461.54563: stdout chunk (state=3): >>> <<< 15627 1726882461.54568: stdout chunk (state=3): >>># destroy json.decoder <<< 15627 1726882461.54589: stdout chunk (state=3): >>># destroy json.encoder <<< 15627 1726882461.54603: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy encodings <<< 15627 1726882461.54647: stdout chunk (state=3): >>># destroy syslog <<< 15627 1726882461.54666: stdout chunk (state=3): >>># destroy uuid <<< 15627 1726882461.54730: stdout chunk (state=3): >>># destroy selinux<<< 15627 1726882461.54752: stdout chunk (state=3): >>> # destroy distro # destroy logging # destroy argparse<<< 15627 1726882461.54775: stdout chunk (state=3): >>> <<< 15627 1726882461.54829: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 15627 1726882461.54860: stdout 
chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing<<< 15627 1726882461.54891: stdout chunk (state=3): >>> # destroy multiprocessing.queues # destroy multiprocessing.synchronize <<< 15627 1726882461.54919: stdout chunk (state=3): >>># destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 15627 1726882461.54958: stdout chunk (state=3): >>># destroy queue <<< 15627 1726882461.54981: stdout chunk (state=3): >>># destroy multiprocessing.reduction<<< 15627 1726882461.54995: stdout chunk (state=3): >>> <<< 15627 1726882461.55020: stdout chunk (state=3): >>># destroy shlex<<< 15627 1726882461.55039: stdout chunk (state=3): >>> # destroy datetime<<< 15627 1726882461.55060: stdout chunk (state=3): >>> <<< 15627 1726882461.55074: stdout chunk (state=3): >>># destroy base64 <<< 15627 1726882461.55101: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 15627 1726882461.55123: stdout chunk (state=3): >>># destroy getpass <<< 15627 1726882461.55144: stdout chunk (state=3): >>># destroy json <<< 15627 1726882461.55184: stdout chunk (state=3): >>># destroy socket # destroy struct<<< 15627 1726882461.55209: stdout chunk (state=3): >>> # destroy glob<<< 15627 1726882461.55234: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout<<< 15627 1726882461.55262: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 15627 1726882461.55290: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context <<< 15627 1726882461.55317: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy multiprocessing.util # destroy array<<< 15627 1726882461.55333: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection <<< 15627 1726882461.55393: stdout chunk (state=3): >>># cleanup[3] wiping gc <<< 15627 
1726882461.55423: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 15627 1726882461.55450: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl<<< 15627 1726882461.55490: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle <<< 15627 1726882461.55513: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 15627 1726882461.55544: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 15627 1726882461.55574: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 15627 1726882461.55604: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 15627 1726882461.55632: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 15627 1726882461.55658: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 15627 1726882461.55701: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 15627 1726882461.55722: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 15627 1726882461.55746: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors <<< 15627 1726882461.55776: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess <<< 15627 1726882461.55811: stdout chunk (state=3): >>># cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit<<< 15627 1726882461.55833: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 <<< 15627 1726882461.55866: stdout chunk (state=3): 
>>># cleanup[3] wiping _hashlib # cleanup[3] wiping _random <<< 15627 1726882461.55891: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 15627 1726882461.55925: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp<<< 15627 1726882461.55947: stdout chunk (state=3): >>> # cleanup[3] wiping pwd # cleanup[3] wiping _lzma <<< 15627 1726882461.55973: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping zlib<<< 15627 1726882461.55997: stdout chunk (state=3): >>> # cleanup[3] wiping errno # cleanup[3] wiping weakref<<< 15627 1726882461.56036: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 15627 1726882461.56050: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 15627 1726882461.56079: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 15627 1726882461.56105: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re <<< 15627 1726882461.56135: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg<<< 15627 1726882461.56159: stdout chunk (state=3): >>> # cleanup[3] wiping functools <<< 15627 1726882461.56187: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections <<< 15627 1726882461.56214: stdout chunk (state=3): >>># destroy _collections_abc # destroy heapq # destroy collections.abc<<< 15627 1726882461.56242: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # destroy _collections <<< 15627 1726882461.56281: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 15627 1726882461.56299: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 15627 1726882461.56329: stdout chunk (state=3): >>># cleanup[3] wiping _sre <<< 15627 
1726882461.56359: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale <<< 15627 1726882461.56386: stdout chunk (state=3): >>># cleanup[3] wiping os # cleanup[3] wiping os.path<<< 15627 1726882461.56410: stdout chunk (state=3): >>> # destroy genericpath # cleanup[3] wiping posixpath<<< 15627 1726882461.56440: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 15627 1726882461.56480: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc<<< 15627 1726882461.56504: stdout chunk (state=3): >>> # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 <<< 15627 1726882461.56534: stdout chunk (state=3): >>># cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases<<< 15627 1726882461.56562: stdout chunk (state=3): >>> # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 15627 1726882461.56588: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15627 1726882461.56620: stdout chunk (state=3): >>># cleanup[3] wiping marshal <<< 15627 1726882461.56648: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 15627 1726882461.56672: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp<<< 15627 1726882461.56700: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 15627 1726882461.56719: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 15627 1726882461.56802: stdout chunk (state=3): >>># destroy gc<<< 15627 1726882461.57976: stdout chunk (state=3): >>> # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle <<< 15627 1726882461.57987: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 15627 1726882461.58461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882461.58468: stderr chunk (state=3): >>><<< 15627 1726882461.58471: stdout chunk (state=3): >>><<< 15627 1726882461.58618: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019b3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019b3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67019b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670190f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6701970880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701908d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701932d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701958970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018d3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701591dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015918b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701591e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018aed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018a7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67015a3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018ae250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67018bb280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a3d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701576370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701576460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015abfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014aa1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701561c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a5eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67018e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014bcaf0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014bce20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014ce730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014cec70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014663a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014bcf10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701477280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014ce5b0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701477340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a39d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67014926a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701492760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6701492ca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f670149f1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67014928e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701486a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67015a35b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6701492a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f67013c1670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d047f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d95760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95190> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d95400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d957c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d6e7c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d6eb50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d6e9a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700c874f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d8ed30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d95520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d8e190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700dbfa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d62190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d62790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700c8dd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d626a0> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700de3d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700ce59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700deee50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700deee20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700df5220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf5100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700db9b80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700deeac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700deed00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67013c1820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf10d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700ce7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf1d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700cf16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700cf2130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d2d8b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d33910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a56a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d6c7f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a9d90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700d200a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086e070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d29160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700d25cd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008a9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700624a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008836d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700883af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086a250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670086aa30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67008b5d00> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f67008b5d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b5250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f670068cf70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008c6b50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67008b9310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f67005a4ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67005a4fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6700598370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f67005e7bb0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f670051e160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051e2b0> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_vbtt5edz/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f67005728b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700572310> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700388850> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700572370> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051eeb0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6700383310> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f670051a700> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": 
"CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": 
"/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241455104, "block_size": 4096, "block_total": 65519355, "block_available": 64512074, "block_used": 1007281, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "21", "epoch": "1726882461", "epoch_int": "1726882461", "date": "2024-09-20", "time": "21:34:21", "iso8601_micro": "2024-09-21T01:34:21.456583Z", "iso8601": "2024-09-21T01:34:21Z", "iso8601_basic": "20240920T213421456583", "iso8601_basic_short": "20240920T213421", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", 
"network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing 
encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing 
socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] 
wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl 
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
15627 1726882461.59752: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882461.59758: _low_level_execute_command(): starting 15627 1726882461.59761: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882460.3134267-15670-28312173966363/ > /dev/null 2>&1 && sleep 0' 15627 1726882461.60343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882461.61181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.61191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.61205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.61242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.61249: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882461.61259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.61275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882461.61282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 15627 1726882461.61288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882461.61296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.61305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.61315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.61323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.61328: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882461.61338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.61408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882461.61427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882461.61440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.61568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882461.63842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882461.63847: stdout chunk (state=3): >>><<< 15627 1726882461.63852: stderr chunk (state=3): >>><<< 15627 1726882461.63900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882461.63904: handler run complete 15627 1726882461.64029: variable 'ansible_facts' from source: unknown 15627 1726882461.64165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.64537: variable 'ansible_facts' from source: unknown 15627 1726882461.64657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.65017: attempt loop complete, returning result 15627 1726882461.65021: _execute() done 15627 1726882461.65023: dumping result to json 15627 1726882461.65619: done dumping result, returning 15627 1726882461.65646: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-00000000007e] 15627 1726882461.65668: sending task result for task 0e448fcc-3ce9-2847-7723-00000000007e ok: [managed_node1] 15627 1726882461.67940: no more pending results, returning what we have 15627 1726882461.67943: results queue empty 15627 1726882461.67944: checking for any_errors_fatal 15627 1726882461.67945: done checking for any_errors_fatal 15627 1726882461.67946: checking for max_fail_percentage 15627 1726882461.67948: done checking for max_fail_percentage 15627 1726882461.67948: checking to see if all hosts have failed and the running result is not ok 15627 1726882461.67949: done 
checking to see if all hosts have failed 15627 1726882461.67950: getting the remaining hosts for this loop 15627 1726882461.67952: done getting the remaining hosts for this loop 15627 1726882461.67958: getting the next task for host managed_node1 15627 1726882461.67971: done getting next task for host managed_node1 15627 1726882461.67972: ^ task is: TASK: meta (flush_handlers) 15627 1726882461.67975: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882461.67983: getting variables 15627 1726882461.67985: in VariableManager get_vars() 15627 1726882461.68010: Calling all_inventory to load vars for managed_node1 15627 1726882461.68013: Calling groups_inventory to load vars for managed_node1 15627 1726882461.68017: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882461.68032: Calling all_plugins_play to load vars for managed_node1 15627 1726882461.68035: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882461.68038: Calling groups_plugins_play to load vars for managed_node1 15627 1726882461.68231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.68680: done with get_vars() 15627 1726882461.68696: done getting variables 15627 1726882461.68823: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000007e 15627 1726882461.68827: WORKER PROCESS EXITING 15627 1726882461.68981: in VariableManager get_vars() 15627 1726882461.68992: Calling all_inventory to load vars for managed_node1 15627 1726882461.68994: Calling groups_inventory to load vars for managed_node1 15627 1726882461.68997: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882461.69001: Calling 
all_plugins_play to load vars for managed_node1 15627 1726882461.69004: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882461.69016: Calling groups_plugins_play to load vars for managed_node1 15627 1726882461.69202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.69410: done with get_vars() 15627 1726882461.69424: done queuing things up, now waiting for results queue to drain 15627 1726882461.69426: results queue empty 15627 1726882461.69427: checking for any_errors_fatal 15627 1726882461.69429: done checking for any_errors_fatal 15627 1726882461.69430: checking for max_fail_percentage 15627 1726882461.69431: done checking for max_fail_percentage 15627 1726882461.69432: checking to see if all hosts have failed and the running result is not ok 15627 1726882461.69432: done checking to see if all hosts have failed 15627 1726882461.69433: getting the remaining hosts for this loop 15627 1726882461.69434: done getting the remaining hosts for this loop 15627 1726882461.69436: getting the next task for host managed_node1 15627 1726882461.69445: done getting next task for host managed_node1 15627 1726882461.69447: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15627 1726882461.69449: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882461.69451: getting variables 15627 1726882461.69452: in VariableManager get_vars() 15627 1726882461.69462: Calling all_inventory to load vars for managed_node1 15627 1726882461.69466: Calling groups_inventory to load vars for managed_node1 15627 1726882461.69469: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882461.69473: Calling all_plugins_play to load vars for managed_node1 15627 1726882461.69476: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882461.69478: Calling groups_plugins_play to load vars for managed_node1 15627 1726882461.69626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.69828: done with get_vars() 15627 1726882461.69836: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 21:34:21 -0400 (0:00:01.438) 0:00:01.450 ****** 15627 1726882461.69917: entering _queue_task() for managed_node1/include_tasks 15627 1726882461.69919: Creating lock for include_tasks 15627 1726882461.70350: worker is 1 (out of 1 available) 15627 1726882461.70383: exiting _queue_task() for managed_node1/include_tasks 15627 1726882461.70394: done queuing things up, now waiting for results queue to drain 15627 1726882461.70397: waiting for pending results... 
15627 1726882461.70669: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 15627 1726882461.70783: in run() - task 0e448fcc-3ce9-2847-7723-000000000006 15627 1726882461.70801: variable 'ansible_search_path' from source: unknown 15627 1726882461.70847: calling self._execute() 15627 1726882461.70930: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882461.70946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882461.70967: variable 'omit' from source: magic vars 15627 1726882461.71082: _execute() done 15627 1726882461.71090: dumping result to json 15627 1726882461.71098: done dumping result, returning 15627 1726882461.71109: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-2847-7723-000000000006] 15627 1726882461.71121: sending task result for task 0e448fcc-3ce9-2847-7723-000000000006 15627 1726882461.71246: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000006 15627 1726882461.71261: WORKER PROCESS EXITING 15627 1726882461.71305: no more pending results, returning what we have 15627 1726882461.71311: in VariableManager get_vars() 15627 1726882461.71346: Calling all_inventory to load vars for managed_node1 15627 1726882461.71349: Calling groups_inventory to load vars for managed_node1 15627 1726882461.71353: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882461.71372: Calling all_plugins_play to load vars for managed_node1 15627 1726882461.71376: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882461.71379: Calling groups_plugins_play to load vars for managed_node1 15627 1726882461.71778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.72095: done with get_vars() 15627 1726882461.72102: variable 'ansible_search_path' from source: unknown 15627 1726882461.72176: we have 
included files to process 15627 1726882461.72178: generating all_blocks data 15627 1726882461.72179: done generating all_blocks data 15627 1726882461.72180: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15627 1726882461.72181: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15627 1726882461.72184: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15627 1726882461.73566: in VariableManager get_vars() 15627 1726882461.73583: done with get_vars() 15627 1726882461.73596: done processing included file 15627 1726882461.73598: iterating over new_blocks loaded from include file 15627 1726882461.73600: in VariableManager get_vars() 15627 1726882461.73610: done with get_vars() 15627 1726882461.73611: filtering new block on tags 15627 1726882461.73627: done filtering new block on tags 15627 1726882461.73630: in VariableManager get_vars() 15627 1726882461.73646: done with get_vars() 15627 1726882461.73648: filtering new block on tags 15627 1726882461.73669: done filtering new block on tags 15627 1726882461.73694: in VariableManager get_vars() 15627 1726882461.73705: done with get_vars() 15627 1726882461.73706: filtering new block on tags 15627 1726882461.73720: done filtering new block on tags 15627 1726882461.73722: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 15627 1726882461.73727: extending task lists for all hosts with included blocks 15627 1726882461.73904: done extending task lists 15627 1726882461.73905: done processing included files 15627 1726882461.73906: results queue empty 15627 1726882461.73907: checking for any_errors_fatal 15627 1726882461.73908: done checking for any_errors_fatal 15627 
1726882461.73909: checking for max_fail_percentage 15627 1726882461.73910: done checking for max_fail_percentage 15627 1726882461.73911: checking to see if all hosts have failed and the running result is not ok 15627 1726882461.73911: done checking to see if all hosts have failed 15627 1726882461.73912: getting the remaining hosts for this loop 15627 1726882461.73913: done getting the remaining hosts for this loop 15627 1726882461.73916: getting the next task for host managed_node1 15627 1726882461.73920: done getting next task for host managed_node1 15627 1726882461.73926: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15627 1726882461.73929: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882461.73931: getting variables 15627 1726882461.73932: in VariableManager get_vars() 15627 1726882461.73940: Calling all_inventory to load vars for managed_node1 15627 1726882461.73942: Calling groups_inventory to load vars for managed_node1 15627 1726882461.73944: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882461.73949: Calling all_plugins_play to load vars for managed_node1 15627 1726882461.73952: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882461.73957: Calling groups_plugins_play to load vars for managed_node1 15627 1726882461.74357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882461.74746: done with get_vars() 15627 1726882461.74758: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:34:21 -0400 (0:00:00.049) 0:00:01.499 ****** 15627 1726882461.74831: entering _queue_task() for managed_node1/setup 15627 1726882461.75158: worker is 1 (out of 1 available) 15627 1726882461.75176: exiting _queue_task() for managed_node1/setup 15627 1726882461.75187: done queuing things up, now waiting for results queue to drain 15627 1726882461.75189: waiting for pending results... 
15627 1726882461.75453: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 15627 1726882461.75580: in run() - task 0e448fcc-3ce9-2847-7723-00000000008f 15627 1726882461.75602: variable 'ansible_search_path' from source: unknown 15627 1726882461.75615: variable 'ansible_search_path' from source: unknown 15627 1726882461.75660: calling self._execute() 15627 1726882461.75742: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882461.75757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882461.75777: variable 'omit' from source: magic vars 15627 1726882461.76409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882461.79237: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882461.79335: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882461.79380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882461.79430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882461.79471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882461.79569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882461.79605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882461.79640: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882461.79698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882461.79720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882461.79925: variable 'ansible_facts' from source: unknown 15627 1726882461.80011: variable 'network_test_required_facts' from source: task vars 15627 1726882461.80053: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15627 1726882461.80069: variable 'omit' from source: magic vars 15627 1726882461.80124: variable 'omit' from source: magic vars 15627 1726882461.80160: variable 'omit' from source: magic vars 15627 1726882461.80197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882461.80231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882461.80251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882461.80276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882461.80289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882461.80333: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882461.80341: variable 'ansible_host' from source: host vars for 
'managed_node1' 15627 1726882461.80348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882461.80458: Set connection var ansible_timeout to 10 15627 1726882461.80474: Set connection var ansible_shell_executable to /bin/sh 15627 1726882461.80487: Set connection var ansible_connection to ssh 15627 1726882461.80498: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882461.80507: Set connection var ansible_pipelining to False 15627 1726882461.80518: Set connection var ansible_shell_type to sh 15627 1726882461.80566: variable 'ansible_shell_executable' from source: unknown 15627 1726882461.80574: variable 'ansible_connection' from source: unknown 15627 1726882461.80581: variable 'ansible_module_compression' from source: unknown 15627 1726882461.80588: variable 'ansible_shell_type' from source: unknown 15627 1726882461.80593: variable 'ansible_shell_executable' from source: unknown 15627 1726882461.80603: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882461.80612: variable 'ansible_pipelining' from source: unknown 15627 1726882461.80619: variable 'ansible_timeout' from source: unknown 15627 1726882461.80630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882461.80795: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882461.80809: variable 'omit' from source: magic vars 15627 1726882461.80817: starting attempt loop 15627 1726882461.80827: running the handler 15627 1726882461.80851: _low_level_execute_command(): starting 15627 1726882461.80873: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882461.81730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 
1726882461.81762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.81781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.81798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.81843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.81861: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882461.81882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.81917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882461.81929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882461.81938: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882461.81953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.82173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.82190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.82202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.82211: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882461.82223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.82311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882461.82328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882461.82341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.82614: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882461.84849: stdout chunk (state=3): >>>/root <<< 15627 1726882461.85012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882461.85066: stderr chunk (state=3): >>><<< 15627 1726882461.85070: stdout chunk (state=3): >>><<< 15627 1726882461.85090: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882461.85100: _low_level_execute_command(): starting 15627 1726882461.85108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547 `" && echo ansible-tmp-1726882461.8509042-15720-73813528656547="` echo /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547 
`" ) && sleep 0' 15627 1726882461.85786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882461.85800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.85815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.85832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.85884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.85896: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882461.85910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.85926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882461.85938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882461.85948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882461.85967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.85987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.86006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.86037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.86057: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882461.86075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.86221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882461.86244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 
1726882461.86273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.86412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882461.89106: stdout chunk (state=3): >>>ansible-tmp-1726882461.8509042-15720-73813528656547=/root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547 <<< 15627 1726882461.89266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882461.89325: stderr chunk (state=3): >>><<< 15627 1726882461.89328: stdout chunk (state=3): >>><<< 15627 1726882461.89369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882461.8509042-15720-73813528656547=/root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882461.89384: variable 'ansible_module_compression' 
from source: unknown 15627 1726882461.89425: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882461.89473: variable 'ansible_facts' from source: unknown 15627 1726882461.89603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/AnsiballZ_setup.py 15627 1726882461.89707: Sending initial data 15627 1726882461.89716: Sent initial data (153 bytes) 15627 1726882461.90492: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882461.90508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.90523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.90542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.90589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.90606: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882461.90622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.90643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882461.90656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882461.90673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882461.90687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.90701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.90719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.90730: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.90747: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882461.90762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.90841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882461.90858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882461.90875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.91024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882461.93086: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882461.93153: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882461.93243: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpgr7py9pm /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/AnsiballZ_setup.py <<< 15627 1726882461.93373: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882461.95322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882461.95471: stderr chunk (state=3): >>><<< 15627 1726882461.95475: stdout 
chunk (state=3): >>><<< 15627 1726882461.95478: done transferring module to remote 15627 1726882461.95505: _low_level_execute_command(): starting 15627 1726882461.95508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/ /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/AnsiballZ_setup.py && sleep 0' 15627 1726882461.96118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882461.96132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.96147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.96171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.96217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882461.96230: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882461.96245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.96269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882461.96282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882461.96299: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882461.96312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882461.96326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882461.96343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882461.96355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882461.96370: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882461.96388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.96471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882461.96491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882461.96506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.96630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882461.98990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882461.99038: stderr chunk (state=3): >>><<< 15627 1726882461.99041: stdout chunk (state=3): >>><<< 15627 1726882461.99062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 15627 1726882461.99068: _low_level_execute_command(): starting 15627 1726882461.99072: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/AnsiballZ_setup.py && sleep 0' 15627 1726882461.99687: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882461.99736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882461.99750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882461.99912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882462.01871: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15627 1726882462.01920: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15627 1726882462.01969: stdout chunk (state=3): >>>import 'posix' # <<< 15627 1726882462.02032: stdout chunk 
(state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15627 1726882462.02035: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15627 1726882462.02123: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.02146: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 15627 1726882462.02149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98dc0> <<< 15627 1726882462.02241: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 15627 1726882462.02246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98b20> <<< 15627 1726882462.02285: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98ac0> import '_signal' # <<< 15627 1726882462.02321: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 
'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d490> <<< 15627 1726882462.02387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 15627 1726882462.02402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d940> <<< 15627 1726882462.02419: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d670> <<< 15627 1726882462.02436: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 15627 1726882462.02495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 15627 1726882462.02514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 15627 1726882462.02568: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf190> <<< 15627 1726882462.02573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 15627 1726882462.02587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 15627 1726882462.02667: stdout chunk (state=3): >>>import '_collections_abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf220> <<< 15627 1726882462.02716: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 15627 1726882462.02766: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf940> <<< 15627 1726882462.02770: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec55880> <<< 15627 1726882462.02782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9c8d90> <<< 15627 1726882462.02841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9f2d90> <<< 15627 1726882462.02886: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d970> <<< 15627 1726882462.02924: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15627 1726882462.03282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 15627 1726882462.03313: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 15627 1726882462.03342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 15627 1726882462.03367: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce993eb0> <<< 15627 1726882462.03417: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce996f40> <<< 15627 1726882462.03448: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 15627 1726882462.03490: stdout chunk (state=3): >>>import '_sre' # <<< 15627 1726882462.03524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 15627 1726882462.03569: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f23ce98c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce992640> <<< 15627 1726882462.03586: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce993370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 15627 1726882462.03665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 15627 1726882462.03713: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 15627 1726882462.03757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 15627 1726882462.03795: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce878e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878910> import 'itertools' # <<< 15627 1726882462.03845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 15627 1726882462.03888: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878fd0> <<< 15627 1726882462.03910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88b0d0> import '_collections' # <<< 15627 1726882462.03952: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce96ed90> import '_functools' # <<< 15627 1726882462.03977: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce967670> <<< 15627 1726882462.04069: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce97a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce99ae20> <<< 15627 1726882462.04102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce88bcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce96e2b0> <<< 15627 1726882462.04167: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce97a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9a09d0> <<< 15627 1726882462.04196: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.04251: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88beb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88bdf0> <<< 15627 1726882462.04306: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88bd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 15627 1726882462.04335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 15627 1726882462.04359: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 15627 1726882462.04374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 15627 1726882462.04414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 15627 1726882462.04446: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce85e3d0> <<< 15627 1726882462.04473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 15627 1726882462.04499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 15627 1726882462.04511: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce85e4c0> <<< 15627 1726882462.04628: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce892f40> <<< 15627 1726882462.04668: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88da90> <<< 15627 1726882462.04702: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 15627 1726882462.04724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 15627 1726882462.04776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches 
/usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 15627 1726882462.04787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce792220> <<< 15627 1726882462.04821: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce849520> <<< 15627 1726882462.04878: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88df10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9a0040> <<< 15627 1726882462.04890: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 15627 1726882462.04944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 15627 1726882462.04968: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7a4b50> import 'errno' # <<< 15627 1726882462.05004: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7a4e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 15627 1726882462.05043: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 15627 1726882462.05067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5790> <<< 15627 1726882462.05081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 15627 1726882462.05100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 15627 1726882462.05126: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5cd0> <<< 15627 1726882462.05185: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce743400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7a4f70> <<< 15627 1726882462.05197: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 15627 1726882462.05244: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7542e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5610> <<< 15627 1726882462.05277: stdout 
chunk (state=3): >>>import 'pwd' # <<< 15627 1726882462.05288: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7543a0> <<< 15627 1726882462.05328: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88ba30> <<< 15627 1726882462.05340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 15627 1726882462.05367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 15627 1726882462.05393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 15627 1726882462.05435: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76f700> <<< 15627 1726882462.05471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 15627 1726882462.05504: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f23ce76f9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76f7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76f8b0> <<< 15627 1726882462.05528: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 15627 1726882462.05732: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76fd00> <<< 15627 1726882462.05773: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce77a250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76f940> <<< 15627 1726882462.05796: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce763a90> <<< 15627 1726882462.05814: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88b610> <<< 15627 1726882462.05823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 15627 1726882462.05889: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 15627 1726882462.05916: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76faf0> <<< 15627 1726882462.06074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f23ce69d6d0> <<< 15627 1726882462.06332: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip' # zipimport: zlib available <<< 15627 1726882462.06434: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.06483: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 15627 1726882462.06501: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 15627 1726882462.07751: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.08708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce531820> <<< 15627 1726882462.08762: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 15627 1726882462.08795: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 15627 1726882462.08807: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5c0730> <<< 15627 1726882462.08833: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0610> <<< 15627 1726882462.08888: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0340> <<< 15627 1726882462.08904: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 15627 1726882462.08936: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0160> <<< 15627 1726882462.08946: stdout chunk (state=3): >>>import 'atexit' # <<< 15627 1726882462.08987: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5c03a0> <<< 15627 1726882462.09012: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 15627 1726882462.09024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 15627 1726882462.09052: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0790> <<< 15627 1726882462.09086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 15627 1726882462.09123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 15627 1726882462.09146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 15627 1726882462.09160: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 15627 1726882462.09238: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5b0820> <<< 15627 1726882462.09276: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5b0490> <<< 15627 1726882462.09329: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5b0640> <<< 15627 1726882462.09347: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 15627 1726882462.09380: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdf88520> <<< 15627 1726882462.09392: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bbd60> <<< 15627 1726882462.09575: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c04f0> <<< 15627 1726882462.09597: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bb1c0> <<< 15627 1726882462.09626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 15627 1726882462.09687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 15627 1726882462.09726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 15627 1726882462.09741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 
'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bfb20> <<< 15627 1726882462.09811: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce58f160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce58f760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdf8ed30> <<< 15627 1726882462.09845: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce58f670> <<< 15627 1726882462.09884: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce611d00> <<< 15627 1726882462.09917: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 15627 1726882462.09929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 15627 1726882462.09969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 15627 1726882462.10034: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfe5a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce61be80> <<< 15627 1726882462.10065: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 15627 1726882462.10119: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdff30a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce61beb0> <<< 15627 1726882462.10149: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 15627 1726882462.10212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.10226: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 15627 1726882462.10275: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce623250> <<< 15627 1726882462.10405: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdff30d0> <<< 15627 1726882462.10494: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce623a60> <<< 15627 1726882462.10532: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5e5b80> <<< 15627 1726882462.10572: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce61bcd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce611ee0> <<< 15627 1726882462.10612: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 15627 1726882462.10638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 15627 1726882462.10684: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfef0d0> <<< 15627 1726882462.10874: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfe6310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdfefcd0> <<< 15627 1726882462.10916: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfef670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdff0100> <<< 15627 1726882462.10948: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 15627 1726882462.10968: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11036: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11114: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11159: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 15627 1726882462.11172: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11270: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11366: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.11826: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.12299: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 15627 1726882462.12341: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.12393: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce55c910> <<< 15627 1726882462.12484: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5619a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb86640> <<< 
15627 1726882462.12548: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 15627 1726882462.12574: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.12582: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 15627 1726882462.12585: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.12702: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.13151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5977f0> # zipimport: zlib available <<< 15627 1726882462.13514: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.14140: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14149: stdout chunk (state=3): >>> <<< 15627 1726882462.14234: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14239: stdout chunk (state=3): >>> <<< 15627 1726882462.14333: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/collections.py<<< 15627 1726882462.14350: stdout chunk (state=3): >>> <<< 15627 1726882462.14353: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14364: stdout chunk (state=3): >>> <<< 15627 1726882462.14404: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14410: stdout chunk (state=3): >>> <<< 15627 1726882462.14461: stdout chunk (state=3): 
>>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py<<< 15627 1726882462.14483: stdout chunk (state=3): >>> <<< 15627 1726882462.14486: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14488: stdout chunk (state=3): >>> <<< 15627 1726882462.14580: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.14588: stdout chunk (state=3): >>> <<< 15627 1726882462.14696: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/errors.py<<< 15627 1726882462.14702: stdout chunk (state=3): >>> <<< 15627 1726882462.14728: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.14749: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.14776: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 15627 1726882462.14789: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.14850: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.14901: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 15627 1726882462.14932: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.15238: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.15518: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 15627 1726882462.15566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 15627 
1726882462.15673: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5dd460> <<< 15627 1726882462.15678: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.15709: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.15797: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 15627 1726882462.15819: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.16432: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 
'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5510d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5611f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 15627 1726882462.16802: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 15627 1726882462.16825: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce563bb0> <<< 15627 1726882462.16850: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce62c070> <<< 15627 1726882462.16933: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5542e0> <<< 15627 1726882462.16969: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 15627 1726882462.17006: 
stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 15627 1726882462.17074: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 15627 1726882462.17117: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 15627 1726882462.17121: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17180: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17242: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17281: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17357: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17394: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17419: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 15627 1726882462.17492: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17556: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17597: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17612: stdout chunk 
(state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 15627 1726882462.17751: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17897: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17920: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.17977: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.18015: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 15627 1726882462.18031: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 15627 1726882462.18061: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb39400> <<< 15627 1726882462.18092: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 15627 1726882462.18105: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 15627 1726882462.18161: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 15627 1726882462.18177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb989a0> <<< 15627 1726882462.18214: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdb98df0> <<< 15627 1726882462.18292: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb96490> <<< 15627 1726882462.18295: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cda11040> <<< 15627 1726882462.18320: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9013a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9015e0> <<< 15627 1726882462.18365: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 15627 1726882462.18390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 15627 1726882462.18393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 15627 1726882462.18452: stdout chunk (state=3): >>># extension module '_queue' loaded from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5506d0> <<< 15627 1726882462.18457: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdba6730> <<< 15627 1726882462.18460: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 15627 1726882462.18517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 15627 1726882462.18527: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5505e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 15627 1726882462.18539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 15627 1726882462.18576: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdb523a0> <<< 15627 1726882462.19151: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9609a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9014f0> import ansible.module_utils.facts.timeout # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 15627 1726882462.19163: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.19372: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available <<< 15627 1726882462.19432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.19496: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 15627 1726882462.19511: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 15627 1726882462.20233: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.20238: stdout chunk (state=3): >>> <<< 15627 1726882462.20859: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 15627 1726882462.20887: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.20972: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21055: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.21067: stdout chunk (state=3): >>> <<< 15627 1726882462.21107: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21161: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py<<< 15627 1726882462.21174: stdout chunk (state=3): >>> <<< 15627 1726882462.21198: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 15627 1726882462.21209: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21242: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15627 1726882462.21268: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 15627 1726882462.21332: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21389: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 15627 1726882462.21400: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21411: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21437: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 15627 1726882462.21471: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21500: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 15627 1726882462.21571: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21646: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 15627 1726882462.21675: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9019d0> <<< 15627 1726882462.21695: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 15627 1726882462.21720: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 15627 1726882462.21881: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd880f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 15627 1726882462.21940: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.21999: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 15627 1726882462.22098: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.22156: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 15627 1726882462.22179: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.22229: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.22716: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f23cd8783a0> <<< 15627 1726882462.23039: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd8c6100> <<< 15627 1726882462.23065: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 15627 1726882462.23090: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23174: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23247: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 15627 1726882462.23277: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23388: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23506: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23673: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.23888: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 15627 1726882462.23910: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 15627 1726882462.23936: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24001: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24062: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 15627 1726882462.24092: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24149: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.24156: stdout chunk (state=3): >>> <<< 15627 1726882462.24221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py<<< 15627 1726882462.24241: stdout chunk (state=3): >>> <<< 15627 1726882462.24265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc'<<< 15627 1726882462.24270: stdout chunk (state=3): >>> <<< 15627 1726882462.24328: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882462.24346: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882462.24365: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cd80c6a0> <<< 15627 1726882462.24380: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd80ca90><<< 15627 1726882462.24395: stdout chunk (state=3): >>> <<< 15627 1726882462.24405: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py<<< 15627 1726882462.24410: stdout chunk (state=3): >>> <<< 15627 1726882462.24432: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.24442: stdout chunk (state=3): >>> <<< 15627 1726882462.24470: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24493: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py<<< 15627 1726882462.24498: stdout chunk (state=3): >>> <<< 15627 1726882462.24526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24584: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.24589: stdout chunk (state=3): >>> <<< 15627 1726882462.24644: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 15627 1726882462.24676: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.24903: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.24908: stdout chunk (state=3): >>> <<< 15627 1726882462.25114: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 15627 1726882462.25139: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.25279: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.25287: stdout chunk (state=3): >>> <<< 15627 1726882462.25420: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.25425: stdout chunk (state=3): >>> <<< 15627 1726882462.25481: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.25538: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 15627 1726882462.25567: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 15627 1726882462.25593: stdout chunk (state=3): 
>>># zipimport: zlib available<<< 15627 1726882462.25597: stdout chunk (state=3): >>> <<< 15627 1726882462.25732: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.25734: stdout chunk (state=3): >>> <<< 15627 1726882462.25770: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.25961: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.25967: stdout chunk (state=3): >>> <<< 15627 1726882462.26155: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 15627 1726882462.26178: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py<<< 15627 1726882462.26183: stdout chunk (state=3): >>> <<< 15627 1726882462.26205: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.26375: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.26384: stdout chunk (state=3): >>> <<< 15627 1726882462.26553: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py<<< 15627 1726882462.26560: stdout chunk (state=3): >>> <<< 15627 1726882462.26579: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.26626: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.26631: stdout chunk (state=3): >>> <<< 15627 1726882462.26682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.27433: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.28136: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 15627 1726882462.28166: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 15627 1726882462.28193: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.28334: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.28339: stdout chunk (state=3): >>> <<< 15627 1726882462.28488: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 15627 1726882462.28512: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.28647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.28788: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py<<< 15627 1726882462.28793: stdout chunk (state=3): >>> <<< 15627 1726882462.28813: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.28819: stdout chunk (state=3): >>> <<< 15627 1726882462.29019: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.29026: stdout chunk (state=3): >>> <<< 15627 1726882462.29233: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 15627 1726882462.29265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.29290: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.29314: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 15627 1726882462.29345: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.29410: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.29415: stdout chunk (state=3): >>> <<< 15627 1726882462.29474: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py<<< 15627 1726882462.29484: stdout chunk (state=3): >>> <<< 15627 1726882462.29500: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.29505: stdout chunk (state=3): >>> <<< 15627 1726882462.29608: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.29751: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.29757: stdout chunk (state=3): >>> <<< 15627 1726882462.30057: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30339: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 15627 1726882462.30365: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 15627 1726882462.30389: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30441: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.30448: stdout chunk (state=3): >>> <<< 15627 1726882462.30497: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py<<< 15627 1726882462.30502: stdout chunk (state=3): >>> <<< 15627 1726882462.30526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30568: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py<<< 15627 1726882462.30615: stdout chunk (state=3): >>> <<< 15627 1726882462.30632: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.30637: stdout chunk (state=3): >>> <<< 15627 1726882462.30733: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.30738: stdout chunk (state=3): >>> <<< 15627 1726882462.30833: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py<<< 15627 1726882462.30838: stdout chunk (state=3): >>> <<< 15627 1726882462.30867: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30903: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.30941: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 15627 1726882462.30970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.31045: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.31127: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 15627 1726882462.31155: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.31234: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.31316: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 15627 1726882462.31343: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.31715: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.32082: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py<<< 15627 1726882462.32092: stdout chunk (state=3): >>> <<< 15627 1726882462.32109: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.32115: stdout chunk (state=3): >>> <<< 15627 1726882462.32189: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.32274: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 15627 1726882462.32298: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.32303: stdout chunk (state=3): >>> <<< 15627 1726882462.32348: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.32353: stdout chunk (state=3): >>> <<< 15627 1726882462.32400: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 15627 1726882462.32427: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.32474: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.32481: stdout chunk (state=3): >>> <<< 15627 
1726882462.32524: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 15627 1726882462.32551: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.32556: stdout chunk (state=3): >>> <<< 15627 1726882462.32602: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.32651: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py<<< 15627 1726882462.32660: stdout chunk (state=3): >>> <<< 15627 1726882462.32684: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.32792: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33083: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 15627 1726882462.33104: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33136: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 15627 1726882462.33165: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33194: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.33199: stdout chunk (state=3): >>> <<< 15627 1726882462.33230: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882462.33300: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.33305: stdout chunk (state=3): >>> <<< 15627 1726882462.33378: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33476: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33576: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 15627 1726882462.33604: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 15627 1726882462.33621: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 15627 1726882462.33644: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.33649: stdout chunk (state=3): >>> <<< 15627 1726882462.33712: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.33784: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py<<< 15627 1726882462.33789: stdout chunk (state=3): >>> <<< 15627 1726882462.33809: stdout chunk (state=3): >>># zipimport: zlib available<<< 15627 1726882462.33814: stdout chunk (state=3): >>> <<< 15627 1726882462.34073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.34338: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 15627 1726882462.34367: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15627 1726882462.34434: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.34530: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 15627 1726882462.34534: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.34601: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.34724: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 15627 1726882462.34805: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.34934: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 15627 1726882462.34960: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 15627 1726882462.34974: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.35084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.35199: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 15627 1726882462.35231: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py <<< 15627 1726882462.35246: stdout chunk (state=3): >>>import 
ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 15627 1726882462.35366: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882462.35659: stdout chunk (state=3): >>>import 'gc' # <<< 15627 1726882462.36134: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 15627 1726882462.36160: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 15627 1726882462.36175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 15627 1726882462.36228: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cd7ee040> <<< 15627 1726882462.36231: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd7f0100> <<< 15627 1726882462.36324: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd7f0ca0> <<< 15627 1726882462.38758: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", 
"ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71Q<<< 15627 1726882462.38795: stdout chunk (state=3): >>>U/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": 
{"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "22", "epoch": "1726882462", "epoch_int": "1726882462", "date": "2024-09-20", "time": "21:34:22", "iso8601_micro": "2024-09-21T01:34:22.376662Z", "iso8601": "2024-09-21T01:34:22Z", "iso8601_basic": "20240920T213422376662", "iso8601_basic_short": "20240920T213422", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882462.39433: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 15627 1726882462.39445: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore 
sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 15627 1726882462.39455: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants <<< 15627 1726882462.39489: stdout chunk (state=3): >>># destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing 
struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math <<< 15627 1726882462.39493: stdout chunk (state=3): >>># cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] 
removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 15627 1726882462.39514: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 15627 1726882462.39532: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 15627 1726882462.39549: stdout chunk 
(state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15627 1726882462.39567: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] 
removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux <<< 15627 1726882462.39587: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn <<< 
15627 1726882462.39591: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys <<< 15627 1726882462.39597: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15627 1726882462.39961: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15627 1726882462.39966: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 15627 1726882462.40062: stdout chunk (state=3): >>># destroy zipimport <<< 15627 1726882462.40065: stdout chunk (state=3): >>># destroy _compression <<< 15627 1726882462.40068: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 15627 1726882462.40093: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 15627 1726882462.40096: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 15627 1726882462.40098: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 15627 1726882462.40149: stdout chunk (state=3): >>># 
destroy selinux # destroy distro # destroy logging # destroy argparse <<< 15627 1726882462.40258: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 15627 1726882462.40261: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 15627 1726882462.40280: stdout chunk (state=3): >>># destroy shlex <<< 15627 1726882462.40315: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 15627 1726882462.40318: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 15627 1726882462.40369: stdout chunk (state=3): >>># destroy json <<< 15627 1726882462.40392: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 15627 1726882462.40493: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 15627 1726882462.40589: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping 
systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections <<< 15627 1726882462.40605: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 15627 1726882462.40652: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl <<< 15627 1726882462.40676: stdout chunk (state=3): >>># destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 15627 1726882462.40904: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 15627 1726882462.40937: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 15627 1726882462.40959: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 15627 1726882462.41015: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 15627 
1726882462.41436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882462.41519: stderr chunk (state=3): >>><<< 15627 1726882462.41522: stdout chunk (state=3): >>><<< 15627 1726882462.41700: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cec3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce993eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce996f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce98c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce992640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce993370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce878e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce878fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88b0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce96ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce967670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23ce97a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce99ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce88bcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce96e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce97a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9a09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88beb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88bdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88bd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce85e3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce85e4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce892f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88da90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce792220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce849520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88df10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce9a0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7a4b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7a4e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce743400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7a4f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7542e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce7b5610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce7543a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88ba30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76f700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76f9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76f7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76f8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce76fd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce77a250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76f940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce763a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce88b610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce76faf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f23ce69d6d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce531820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5c0730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5c03a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c0790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5b0820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5b0490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5b0640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdf88520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bbd60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5c04f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bb1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5bfb20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce58f160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce58f760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdf8ed30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce58f670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce611d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfe5a00> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce61be80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdff30a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce61beb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce623250> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdff30d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce623a60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5e5b80> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce61bcd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce611ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfef0d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfe6310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdfefcd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdfef670> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23cdff0100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce55c910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5619a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb86640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5977f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5dd460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5510d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5611f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce563bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce62c070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5542e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb39400> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb989a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdb98df0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdb96490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cda11040> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9013a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9015e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23ce5506d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cdba6730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23ce5505e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cdb523a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9609a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9014f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd9019d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd880f40> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cd8783a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd8c6100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cd80c6a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd80ca90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
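The repeated `loaded from Zip /tmp/ansible_setup_payload_…/ansible_setup_payload.zip/…` lines above come from Python's built-in `zipimport` machinery: the AnsiballZ payload ships `ansible.module_utils` inside a zip archive, and the interpreter imports modules directly out of it once the archive is on `sys.path`. A minimal sketch of the same mechanism, using a hypothetical package name (`mypkg`) and a throwaway zip, not the actual Ansible payload:

```python
import os
import sys
import tempfile
import zipfile

# Build a tiny zip containing a package -- a stand-in for the
# ansible_setup_payload.zip seen in the log above.
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, "payload.zip")  # hypothetical name
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("mypkg/__init__.py", "")
    zf.writestr("mypkg/greet.py", "def hello():\n    return 'hi'\n")

# Prepending the zip to sys.path lets the zipimport path hook handle
# the import transparently; no extraction to disk is needed.
sys.path.insert(0, zip_path)
import mypkg.greet

print(mypkg.greet.hello())
```

With `PYTHONVERBOSE=1` set (as in this run, per the `ansible_env` facts below), each such import is echoed as an `import … # loaded from Zip …` line like the ones in this log.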
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_s9370yhf/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f23cd7ee040> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f23cd7f0100> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f23cd7f0ca0> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "22", "epoch": "1726882462", "epoch_int": "1726882462", "date": "2024-09-20", "time": "21:34:22", "iso8601_micro": "2024-09-21T01:34:22.376662Z", "iso8601": "2024-09-21T01:34:22Z", "iso8601_basic": "20240920T213422376662", "iso8601_basic_short": "20240920T213422", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing 
heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] 
removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing 
configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy 
multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping 
importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd 
# destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
[WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown trace identical to the cleanup output above)
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 15627 1726882462.42993: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882462.42996: _low_level_execute_command(): starting 15627 1726882462.42999: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882461.8509042-15720-73813528656547/ > /dev/null 2>&1 && sleep 0' 15627 1726882462.43680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882462.43697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.43714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.43730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.43780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882462.43791: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882462.43807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.43827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882462.43837: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882462.43885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882462.43899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.43916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.43936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.43947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.43962: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882462.43979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.44061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.44089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.44104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.44234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882462.46305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882462.46387: stderr chunk (state=3): >>><<< 15627 1726882462.46392: stdout chunk (state=3): >>><<< 15627 1726882462.46416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882462.46422: handler run complete 15627 1726882462.46490: variable 'ansible_facts' from source: unknown 15627 1726882462.46542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882462.46668: variable 'ansible_facts' from source: unknown 15627 1726882462.46720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882462.46777: attempt loop complete, returning result 15627 1726882462.46781: _execute() done 15627 1726882462.46783: dumping result to json 15627 1726882462.46799: done dumping result, returning 15627 1726882462.46808: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-2847-7723-00000000008f] 15627 1726882462.46812: sending task result for task 0e448fcc-3ce9-2847-7723-00000000008f 15627 1726882462.46974: done 
sending task result for task 0e448fcc-3ce9-2847-7723-00000000008f 15627 1726882462.46976: WORKER PROCESS EXITING ok: [managed_node1] 15627 1726882462.47109: no more pending results, returning what we have 15627 1726882462.47112: results queue empty 15627 1726882462.47112: checking for any_errors_fatal 15627 1726882462.47114: done checking for any_errors_fatal 15627 1726882462.47114: checking for max_fail_percentage 15627 1726882462.47116: done checking for max_fail_percentage 15627 1726882462.47116: checking to see if all hosts have failed and the running result is not ok 15627 1726882462.47117: done checking to see if all hosts have failed 15627 1726882462.47118: getting the remaining hosts for this loop 15627 1726882462.47119: done getting the remaining hosts for this loop 15627 1726882462.47122: getting the next task for host managed_node1 15627 1726882462.47131: done getting next task for host managed_node1 15627 1726882462.47133: ^ task is: TASK: Check if system is ostree 15627 1726882462.47135: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882462.47138: getting variables 15627 1726882462.47139: in VariableManager get_vars() 15627 1726882462.47158: Calling all_inventory to load vars for managed_node1 15627 1726882462.47161: Calling groups_inventory to load vars for managed_node1 15627 1726882462.47165: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882462.47175: Calling all_plugins_play to load vars for managed_node1 15627 1726882462.47177: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882462.47179: Calling groups_plugins_play to load vars for managed_node1 15627 1726882462.47334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882462.47545: done with get_vars() 15627 1726882462.47555: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:34:22 -0400 (0:00:00.728) 0:00:02.228 ****** 15627 1726882462.47652: entering _queue_task() for managed_node1/stat 15627 1726882462.47900: worker is 1 (out of 1 available) 15627 1726882462.47914: exiting _queue_task() for managed_node1/stat 15627 1726882462.47924: done queuing things up, now waiting for results queue to drain 15627 1726882462.47926: waiting for pending results... 
15627 1726882462.48254: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 15627 1726882462.48365: in run() - task 0e448fcc-3ce9-2847-7723-000000000091 15627 1726882462.48381: variable 'ansible_search_path' from source: unknown 15627 1726882462.48395: variable 'ansible_search_path' from source: unknown 15627 1726882462.48431: calling self._execute() 15627 1726882462.48515: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882462.48526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882462.48543: variable 'omit' from source: magic vars 15627 1726882462.48995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882462.49281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882462.49333: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882462.49377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882462.49415: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882462.49506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882462.49534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882462.49562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882462.49601: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882462.49725: Evaluated conditional (not __network_is_ostree is defined): True 15627 1726882462.49736: variable 'omit' from source: magic vars 15627 1726882462.49778: variable 'omit' from source: magic vars 15627 1726882462.49827: variable 'omit' from source: magic vars 15627 1726882462.49857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882462.49892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882462.49918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882462.49941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882462.49955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882462.49986: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882462.49996: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882462.50003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882462.50111: Set connection var ansible_timeout to 10 15627 1726882462.50123: Set connection var ansible_shell_executable to /bin/sh 15627 1726882462.50138: Set connection var ansible_connection to ssh 15627 1726882462.50149: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882462.50157: Set connection var ansible_pipelining to False 15627 1726882462.50163: Set connection var ansible_shell_type to sh 15627 1726882462.50191: variable 'ansible_shell_executable' from source: unknown 15627 1726882462.50198: variable 'ansible_connection' from 
source: unknown 15627 1726882462.50205: variable 'ansible_module_compression' from source: unknown 15627 1726882462.50211: variable 'ansible_shell_type' from source: unknown 15627 1726882462.50217: variable 'ansible_shell_executable' from source: unknown 15627 1726882462.50223: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882462.50230: variable 'ansible_pipelining' from source: unknown 15627 1726882462.50241: variable 'ansible_timeout' from source: unknown 15627 1726882462.50254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882462.50400: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882462.50414: variable 'omit' from source: magic vars 15627 1726882462.50423: starting attempt loop 15627 1726882462.50429: running the handler 15627 1726882462.50445: _low_level_execute_command(): starting 15627 1726882462.50465: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882462.51224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882462.51241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.51255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.51275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.51317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.51332: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882462.51349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 
1726882462.51371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882462.51384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882462.51396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882462.51409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.51425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.51445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.51462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.51478: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882462.51492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.51575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.51597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.51612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.51737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882462.53693: stdout chunk (state=3): >>>/root <<< 15627 1726882462.53932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882462.53935: stdout chunk (state=3): >>><<< 15627 1726882462.53937: stderr chunk (state=3): >>><<< 15627 1726882462.53968: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882462.54062: _low_level_execute_command(): starting 15627 1726882462.54067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605 `" && echo ansible-tmp-1726882462.5396156-15750-19432718362605="` echo /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605 `" ) && sleep 0' 15627 1726882462.54924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.54928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.54956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882462.54960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.54963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882462.54967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.55036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.55040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.55044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.55140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882462.57639: stdout chunk (state=3): >>>ansible-tmp-1726882462.5396156-15750-19432718362605=/root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605 <<< 15627 1726882462.57794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882462.57865: stderr chunk (state=3): >>><<< 15627 1726882462.57869: stdout chunk (state=3): >>><<< 15627 1726882462.58088: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882462.5396156-15750-19432718362605=/root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882462.58092: variable 'ansible_module_compression' from source: unknown 15627 1726882462.58094: ANSIBALLZ: Using lock for stat 15627 1726882462.58096: ANSIBALLZ: Acquiring lock 15627 1726882462.58098: ANSIBALLZ: Lock acquired: 140251854221392 15627 1726882462.58101: ANSIBALLZ: Creating module 15627 1726882462.81802: ANSIBALLZ: Writing module into payload 15627 1726882462.82035: ANSIBALLZ: Writing module 15627 1726882462.82065: ANSIBALLZ: Renaming module 15627 1726882462.82075: ANSIBALLZ: Done creating module 15627 1726882462.82094: variable 'ansible_facts' from source: unknown 15627 1726882462.82204: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/AnsiballZ_stat.py 15627 1726882462.82902: Sending initial data 15627 1726882462.82906: Sent initial data (152 bytes) 15627 1726882462.83915: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882462.83924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.84279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.84283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.84325: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.84329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 15627 1726882462.84348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.84353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882462.84368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.84439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.84462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.84469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.84595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882462.86594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882462.86687: stderr chunk 
(state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882462.86784: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpcqqnwspe /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/AnsiballZ_stat.py <<< 15627 1726882462.86886: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882462.88233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882462.88320: stderr chunk (state=3): >>><<< 15627 1726882462.88323: stdout chunk (state=3): >>><<< 15627 1726882462.88346: done transferring module to remote 15627 1726882462.88362: _low_level_execute_command(): starting 15627 1726882462.88370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/ /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/AnsiballZ_stat.py && sleep 0' 15627 1726882462.89032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882462.89040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.89051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.89072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.89114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.89127: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882462.89130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.89145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882462.89152: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882462.89163: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882462.89173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.89183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.89194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.89202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.89208: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882462.89218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.89294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.89309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.89312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.89448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882462.91932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882462.92031: stderr chunk (state=3): >>><<< 15627 1726882462.92036: stdout chunk (state=3): >>><<< 15627 1726882462.92054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882462.92062: _low_level_execute_command(): starting 15627 1726882462.92078: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/AnsiballZ_stat.py && sleep 0' 15627 1726882462.92732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882462.92739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.92753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.92770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.92809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.92818: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882462.92827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.92841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882462.92848: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882462.92853: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882462.92868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882462.92876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882462.92888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882462.92895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882462.92902: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882462.92915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882462.92997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882462.93000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882462.93008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882462.93484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882462.96356: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 15627 1726882462.96363: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15627 1726882462.96450: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15627 1726882462.96506: stdout chunk (state=3): >>>import 'posix' # <<< 15627 1726882462.96539: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15627 1726882462.96544: stdout chunk (state=3): >>># installing zipimport hook <<< 15627 1726882462.96596: stdout chunk (state=3): >>>import 'time' # <<< 15627 1726882462.96603: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15627 1726882462.96689: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.96755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 15627 1726882462.96759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 15627 1726882462.96761: stdout chunk (state=3): >>>import '_codecs' # <<< 15627 1726882462.96783: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8dc0> <<< 15627 1726882462.96850: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 15627 1726882462.96880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d3a0> <<< 15627 1726882462.96896: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 15627 1726882462.96927: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8ac0> <<< 15627 1726882462.96944: stdout chunk (state=3): >>>import '_signal' # <<< 15627 1726882462.96983: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 15627 1726882462.96986: stdout chunk 
(state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d490> <<< 15627 1726882462.97012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 15627 1726882462.97038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 15627 1726882462.97086: stdout chunk (state=3): >>>import '_abc' # <<< 15627 1726882462.97089: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d940> <<< 15627 1726882462.97101: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d670> <<< 15627 1726882462.97129: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 15627 1726882462.97149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 15627 1726882462.97175: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 15627 1726882462.97198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 15627 1726882462.97222: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 15627 1726882462.97243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 15627 1726882462.97271: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234190> <<< 15627 1726882462.97303: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 15627 1726882462.97323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 15627 1726882462.97426: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234220> <<< 15627 1726882462.97472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 15627 1726882462.97486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 15627 1726882462.97518: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f257850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234940> <<< 15627 1726882462.97552: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f295880> <<< 15627 1726882462.97583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 15627 1726882462.97604: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f22dd90> <<< 15627 1726882462.97657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 15627 1726882462.97672: stdout chunk (state=3): >>>import '_locale' # import 
'_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f257d90> <<< 15627 1726882462.97745: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d970> <<< 15627 1726882462.97783: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15627 1726882462.98084: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 15627 1726882462.98097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 15627 1726882462.98140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 15627 1726882462.98142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 15627 1726882462.98160: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 15627 1726882462.98195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 15627 1726882462.98231: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 15627 1726882462.98234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 15627 1726882462.98245: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efaeeb0> <<< 15627 1726882462.98318: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efb1f40> <<< 15627 1726882462.98337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches 
/usr/lib64/python3.9/sre_compile.py <<< 15627 1726882462.98350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 15627 1726882462.98368: stdout chunk (state=3): >>>import '_sre' # <<< 15627 1726882462.98405: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 15627 1726882462.98419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 15627 1726882462.98440: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 15627 1726882462.98472: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efa7610> <<< 15627 1726882462.98491: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efad640> <<< 15627 1726882462.98511: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efae370> <<< 15627 1726882462.98532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 15627 1726882462.98627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 15627 1726882462.98651: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 15627 1726882462.98693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882462.98729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 
15627 1726882462.98748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 15627 1726882462.98762: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef2fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2f910> <<< 15627 1726882462.98784: stdout chunk (state=3): >>>import 'itertools' # <<< 15627 1726882462.98810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2ff10> <<< 15627 1726882462.98837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 15627 1726882462.98868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 15627 1726882462.98895: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2ffd0> <<< 15627 1726882462.98927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 15627 1726882462.98950: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef420d0> import '_collections' # <<< 15627 1726882462.99008: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef89d90> <<< 15627 
1726882462.99028: stdout chunk (state=3): >>>import '_functools' # <<< 15627 1726882462.99059: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef82670> <<< 15627 1726882462.99139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efb5e20> <<< 15627 1726882462.99175: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 15627 1726882462.99215: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef42cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef892b0> <<< 15627 1726882462.99270: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882462.99299: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efbb9d0> <<< 15627 1726882462.99330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code 
object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 15627 1726882462.99365: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 15627 1726882462.99391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 15627 1726882462.99439: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42df0> <<< 15627 1726882462.99469: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 15627 1726882462.99500: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 15627 1726882462.99524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 15627 1726882462.99546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 15627 1726882462.99603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 15627 1726882462.99847: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef153d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef154c0> <<< 15627 1726882462.99898: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef49f40> <<< 15627 1726882462.99953: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44a90> <<< 15627 1726882462.99978: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44490> <<< 15627 1726882463.00001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 15627 1726882463.00043: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 15627 1726882463.00074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 15627 1726882463.00093: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee49220> <<< 15627 1726882463.00135: stdout chunk (state=3): >>>import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef00520> <<< 15627 1726882463.00206: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44f10> <<< 15627 1726882463.00221: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efbb040> <<< 15627 1726882463.00238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 15627 1726882463.00274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 15627 1726882463.00308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee5bb50> <<< 15627 1726882463.00330: stdout chunk (state=3): >>>import 'errno' # <<< 15627 1726882463.00361: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee5be80> <<< 15627 1726882463.00384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 15627 1726882463.00420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 15627 1726882463.00441: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fde0ee6c790> <<< 15627 1726882463.00467: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 15627 1726882463.00511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 15627 1726882463.00549: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee6ccd0> <<< 15627 1726882463.00589: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0edfa400> <<< 15627 1726882463.00601: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee5bf70> <<< 15627 1726882463.00632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 15627 1726882463.00691: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee0b2e0> <<< 15627 1726882463.00710: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee6c610> import 'pwd' # <<< 15627 1726882463.00746: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee0b3a0> <<< 15627 1726882463.00793: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42a30> <<< 15627 1726882463.00827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 15627 1726882463.00838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 15627 1726882463.00864: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 15627 1726882463.00884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 15627 1726882463.00915: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee26700> <<< 15627 1726882463.00944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 15627 1726882463.00987: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee269d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee267c0> <<< 15627 1726882463.01015: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee268b0> <<< 15627 1726882463.01055: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 15627 1726882463.01069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 15627 1726882463.01327: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee26d00> <<< 15627 1726882463.01379: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee31250> <<< 15627 1726882463.01408: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee26940> <<< 15627 1726882463.01429: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee1aa90> <<< 15627 1726882463.01441: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42610> <<< 15627 1726882463.01463: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 15627 1726882463.01547: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 15627 1726882463.01582: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee26af0> <<< 15627 1726882463.01717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 15627 1726882463.01740: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fde0ed426d0> <<< 15627 1726882463.02035: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip' # zipimport: zlib available <<< 15627 1726882463.02180: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.02208: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/__init__.py <<< 15627 1726882463.02226: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15627 1726882463.02255: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 15627 1726882463.02268: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.04181: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.05734: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e724820> <<< 15627 1726882463.05739: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882463.05784: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 15627 1726882463.05814: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7b3730> <<< 15627 1726882463.05867: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3610> <<< 15627 1726882463.05926: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3340> <<< 15627 1726882463.05940: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 15627 1726882463.06007: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3160> <<< 15627 1726882463.06026: stdout chunk (state=3): >>>import 'atexit' # <<< 15627 1726882463.06058: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7b33a0> # 
/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 15627 1726882463.06100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 15627 1726882463.06168: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3790> <<< 15627 1726882463.06204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 15627 1726882463.06207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 15627 1726882463.06236: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 15627 1726882463.06266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 15627 1726882463.06283: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 15627 1726882463.06385: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6a47f0> <<< 15627 1726882463.06429: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6a4b80> <<< 15627 1726882463.06466: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6a49d0> <<< 15627 1726882463.06503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 15627 1726882463.06522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 15627 1726882463.06592: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6c3af0> <<< 15627 1726882463.06596: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7add60> <<< 15627 1726882463.06850: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b34f0> <<< 15627 1726882463.06885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc'<<< 15627 1726882463.06919: stdout chunk (state=3): >>> <<< 15627 1726882463.06968: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7ad1c0> <<< 15627 1726882463.07008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 15627 1726882463.07054: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 15627 1726882463.07079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 15627 1726882463.07082: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e71fb20> <<< 15627 1726882463.07789: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e755eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7558b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6be2e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7559a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e784d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e685a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e78ce80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6940a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e78ceb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e759730> <<< 15627 1726882463.07980: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6940d0> <<< 15627 1726882463.08121: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08124: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e691550> <<< 15627 1726882463.08164: stdout chunk (state=3): >>># 
extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08178: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e691610> <<< 15627 1726882463.08234: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08249: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e690c40> <<< 15627 1726882463.08275: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e784ee0> <<< 15627 1726882463.08300: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 15627 1726882463.08315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 15627 1726882463.08335: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 15627 1726882463.08375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 15627 1726882463.08439: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08452: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e714b50> <<< 15627 1726882463.08793: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08796: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e713940> <<< 15627 1726882463.08813: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e686820> <<< 15627 1726882463.08855: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08868: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.08888: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7145b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e74daf0> <<< 15627 1726882463.08900: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.08931: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.08943: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 15627 1726882463.08964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09093: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09215: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882463.09240: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09255: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 15627 1726882463.09287: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09290: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09300: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 15627 1726882463.09321: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09492: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.09659: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.10557: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.11345: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 15627 1726882463.11390: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 15627 1726882463.11393: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 15627 1726882463.11427: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 15627 1726882463.11448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882463.11551: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.11554: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e257df0> <<< 15627 1726882463.11654: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 15627 1726882463.11684: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6605b0> <<< 15627 1726882463.11706: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e651df0> <<< 15627 1726882463.11766: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 15627 1726882463.11781: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.11815: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.11852: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 15627 1726882463.11857: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.12039: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.12266: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 15627 1726882463.12270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 15627 1726882463.12316: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0e70a9d0> <<< 15627 1726882463.12329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.12996: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.13611: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.13711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.13815: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 15627 1726882463.13828: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.13869: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.13946: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 15627 1726882463.13949: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14179: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 15627 1726882463.14198: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14229: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14233: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 15627 1726882463.14291: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14335: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 15627 1726882463.14357: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14674: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.14999: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 15627 1726882463.15046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 15627 1726882463.15086: stdout chunk (state=3): >>>import '_ast' # <<< 15627 1726882463.15193: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e228e50> <<< 15627 1726882463.15216: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15312: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15415: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/validation.py <<< 15627 1726882463.15435: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 15627 1726882463.15470: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 15627 1726882463.15488: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15588: stdout chunk (state=3): >>>import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 15627 1726882463.15601: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15648: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15844: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.15950: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 15627 1726882463.16000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 15627 1726882463.16127: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 15627 1726882463.16140: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e79e910> <<< 15627 1726882463.16180: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e228be0> <<< 15627 1726882463.16237: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 15627 1726882463.16256: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 15627 1726882463.16274: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882463.16469: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.16553: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.16594: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.16644: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 15627 1726882463.16679: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 15627 1726882463.16713: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 15627 1726882463.16777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 15627 1726882463.16794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 15627 1726882463.16819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 15627 1726882463.16987: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e1eac70> <<< 15627 1726882463.17046: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e653670> <<< 15627 1726882463.17150: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e652850> <<< 15627 1726882463.17172: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 15627 1726882463.17183: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.17208: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 
1726882463.17245: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py <<< 15627 1726882463.17262: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 15627 1726882463.17350: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 15627 1726882463.17382: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.17402: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 15627 1726882463.17430: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.17612: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.17900: stdout chunk (state=3): >>># zipimport: zlib available <<< 15627 1726882463.18114: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 15627 1726882463.18141: stdout chunk (state=3): >>># destroy __main__ <<< 15627 1726882463.18514: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv <<< 15627 1726882463.18927: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # 
cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 15627 1726882463.18994: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd <<< 15627 1726882463.19041: stdout chunk (state=3): >>># destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal <<< 15627 1726882463.19133: stdout chunk (state=3): >>># cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 15627 1726882463.19226: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # 
destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 15627 1726882463.19287: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 15627 1726882463.19328: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15627 1726882463.19562: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15627 1726882463.19585: stdout chunk (state=3): >>># destroy importlib.util # 
destroy importlib.abc # destroy importlib.machinery <<< 15627 1726882463.19614: stdout chunk (state=3): >>># destroy zipimport <<< 15627 1726882463.19652: stdout chunk (state=3): >>># destroy _compression <<< 15627 1726882463.19698: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 15627 1726882463.19738: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 15627 1726882463.19757: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 15627 1726882463.19785: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 15627 1726882463.19817: stdout chunk (state=3): >>># destroy array <<< 15627 1726882463.19842: stdout chunk (state=3): >>># destroy datetime <<< 15627 1726882463.19874: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging<<< 15627 1726882463.19893: stdout chunk (state=3): >>> # destroy argparse <<< 15627 1726882463.19940: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 15627 1726882463.20016: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 15627 1726882463.20044: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 15627 1726882463.20093: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 15627 
1726882463.20147: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 15627 1726882463.20174: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 15627 1726882463.20227: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd <<< 15627 1726882463.20276: stdout chunk (state=3): >>># cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 15627 1726882463.20347: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 15627 1726882463.20415: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq <<< 15627 1726882463.20496: stdout chunk (state=3): >>># cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 15627 1726882463.20554: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # 
destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 15627 1726882463.20617: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal <<< 15627 1726882463.20652: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 15627 1726882463.20712: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib<<< 15627 1726882463.20757: stdout chunk (state=3): >>> # destroy _signal <<< 15627 1726882463.20920: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 15627 1726882463.20948: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 15627 1726882463.20981: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 15627 1726882463.21025: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib<<< 15627 1726882463.21051: stdout chunk (state=3): >>> # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 15627 1726882463.21101: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 15627 1726882463.21139: stdout chunk (state=3): >>># destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 15627 1726882463.21153: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 15627 1726882463.21206: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 15627 1726882463.21685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882463.21694: stdout chunk (state=3): >>><<< 15627 1726882463.21705: stderr chunk (state=3): >>><<< 15627 1726882463.21801: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f2d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234220> # 
/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f257850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f234940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f295880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f22dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f257d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0f27d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efaeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efb1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efa7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef2fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef2ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef420d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef89d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef82670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efb5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef42cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ef952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efbb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef153d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef154c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef49f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee49220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef00520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef44f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0efbb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee5bb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee5be80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee6c790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee6ccd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0edfa400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee5bf70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee0b2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee6c610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee0b3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee26700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee269d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee267c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee268b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee26d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0ee31250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee26940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee1aa90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ef42610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0ee26af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fde0ed426d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e724820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7b3730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7b33a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b3790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6a47f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6a4b80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6a49d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6c3af0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7add60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7b34f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7ad1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e71fb20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e755eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e7558b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6be2e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7559a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e784d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e685a00> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e78ce80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e6940a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e78ceb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e759730> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6940d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e691550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e691610> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e690c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e784ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e714b50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e713940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e686820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e7145b0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fde0e74daf0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e257df0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e6605b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e651df0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e70a9d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e228e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fde0e79e910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e228be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e1eac70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e653670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fde0e652850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_h3ljsj7i/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data:
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 15627 1726882463.22507: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882463.22510: _low_level_execute_command(): starting 15627 1726882463.22513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882462.5396156-15750-19432718362605/ > /dev/null 2>&1 && sleep 0' 15627 1726882463.23123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882463.23137: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.23159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.23180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.23220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.23234: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882463.23248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.23275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882463.23287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882463.23297: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882463.23307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.23320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.23334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.23346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.23358: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882463.23375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.23457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.23483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882463.23506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882463.23639: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 15627 1726882463.26398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882463.26402: stdout chunk (state=3): >>><<< 15627 1726882463.26405: stderr chunk (state=3): >>><<< 15627 1726882463.26678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882463.26681: handler run complete 15627 1726882463.26684: attempt loop complete, returning result 15627 1726882463.26686: _execute() done 15627 1726882463.26688: dumping result to json 15627 1726882463.26690: done dumping result, returning 15627 1726882463.26692: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-2847-7723-000000000091] 15627 1726882463.26694: sending task result for task 0e448fcc-3ce9-2847-7723-000000000091 15627 1726882463.26760: done sending task 
result for task 0e448fcc-3ce9-2847-7723-000000000091 15627 1726882463.26763: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15627 1726882463.26838: no more pending results, returning what we have 15627 1726882463.26841: results queue empty 15627 1726882463.26842: checking for any_errors_fatal 15627 1726882463.26849: done checking for any_errors_fatal 15627 1726882463.26850: checking for max_fail_percentage 15627 1726882463.26851: done checking for max_fail_percentage 15627 1726882463.26852: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.26853: done checking to see if all hosts have failed 15627 1726882463.26854: getting the remaining hosts for this loop 15627 1726882463.26856: done getting the remaining hosts for this loop 15627 1726882463.26859: getting the next task for host managed_node1 15627 1726882463.26868: done getting next task for host managed_node1 15627 1726882463.26870: ^ task is: TASK: Set flag to indicate system is ostree 15627 1726882463.26873: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.26876: getting variables 15627 1726882463.26878: in VariableManager get_vars() 15627 1726882463.26906: Calling all_inventory to load vars for managed_node1 15627 1726882463.26909: Calling groups_inventory to load vars for managed_node1 15627 1726882463.26912: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.26923: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.26927: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.26930: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.27096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.27424: done with get_vars() 15627 1726882463.27433: done getting variables 15627 1726882463.27532: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:34:23 -0400 (0:00:00.799) 0:00:03.027 ****** 15627 1726882463.27567: entering _queue_task() for managed_node1/set_fact 15627 1726882463.27569: Creating lock for set_fact 15627 1726882463.27859: worker is 1 (out of 1 available) 15627 1726882463.27873: exiting _queue_task() for managed_node1/set_fact 15627 1726882463.27883: done queuing things up, now waiting for results queue to drain 15627 1726882463.27885: waiting for pending results... 
15627 1726882463.28126: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 15627 1726882463.28238: in run() - task 0e448fcc-3ce9-2847-7723-000000000092 15627 1726882463.28262: variable 'ansible_search_path' from source: unknown 15627 1726882463.28271: variable 'ansible_search_path' from source: unknown 15627 1726882463.28308: calling self._execute() 15627 1726882463.28388: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.28399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.28411: variable 'omit' from source: magic vars 15627 1726882463.28928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882463.29194: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882463.29244: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882463.29282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882463.29320: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882463.29418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882463.29457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882463.29495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882463.29533: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882463.29682: Evaluated conditional (not __network_is_ostree is defined): True 15627 1726882463.29693: variable 'omit' from source: magic vars 15627 1726882463.29734: variable 'omit' from source: magic vars 15627 1726882463.29872: variable '__ostree_booted_stat' from source: set_fact 15627 1726882463.29932: variable 'omit' from source: magic vars 15627 1726882463.29965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882463.30008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882463.30048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882463.30075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.30090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.30132: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882463.30140: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.30148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.30260: Set connection var ansible_timeout to 10 15627 1726882463.30278: Set connection var ansible_shell_executable to /bin/sh 15627 1726882463.30292: Set connection var ansible_connection to ssh 15627 1726882463.30302: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882463.30316: Set connection var ansible_pipelining to False 15627 1726882463.30328: Set connection var ansible_shell_type to sh 15627 1726882463.30357: variable 'ansible_shell_executable' 
from source: unknown 15627 1726882463.30368: variable 'ansible_connection' from source: unknown 15627 1726882463.30376: variable 'ansible_module_compression' from source: unknown 15627 1726882463.30382: variable 'ansible_shell_type' from source: unknown 15627 1726882463.30388: variable 'ansible_shell_executable' from source: unknown 15627 1726882463.30400: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.30407: variable 'ansible_pipelining' from source: unknown 15627 1726882463.30414: variable 'ansible_timeout' from source: unknown 15627 1726882463.30426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.30550: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882463.30567: variable 'omit' from source: magic vars 15627 1726882463.30577: starting attempt loop 15627 1726882463.30584: running the handler 15627 1726882463.30599: handler run complete 15627 1726882463.30618: attempt loop complete, returning result 15627 1726882463.30625: _execute() done 15627 1726882463.30631: dumping result to json 15627 1726882463.30643: done dumping result, returning 15627 1726882463.30659: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-2847-7723-000000000092] 15627 1726882463.30672: sending task result for task 0e448fcc-3ce9-2847-7723-000000000092 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15627 1726882463.30816: no more pending results, returning what we have 15627 1726882463.30819: results queue empty 15627 1726882463.30820: checking for any_errors_fatal 15627 1726882463.30827: done checking for any_errors_fatal 15627 
1726882463.30828: checking for max_fail_percentage 15627 1726882463.30830: done checking for max_fail_percentage 15627 1726882463.30831: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.30832: done checking to see if all hosts have failed 15627 1726882463.30833: getting the remaining hosts for this loop 15627 1726882463.30835: done getting the remaining hosts for this loop 15627 1726882463.30839: getting the next task for host managed_node1 15627 1726882463.30849: done getting next task for host managed_node1 15627 1726882463.30853: ^ task is: TASK: Fix CentOS6 Base repo 15627 1726882463.30855: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.30859: getting variables 15627 1726882463.30861: in VariableManager get_vars() 15627 1726882463.30894: Calling all_inventory to load vars for managed_node1 15627 1726882463.30897: Calling groups_inventory to load vars for managed_node1 15627 1726882463.30900: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.30912: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.30915: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.30919: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.31099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.31327: done with get_vars() 15627 1726882463.31337: done getting variables 15627 1726882463.31518: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000092 15627 1726882463.31521: WORKER PROCESS EXITING 15627 1726882463.31578: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:34:23 -0400 (0:00:00.040) 0:00:03.067 ****** 15627 1726882463.31724: entering _queue_task() for managed_node1/copy 15627 1726882463.32025: worker is 1 (out of 1 available) 15627 1726882463.32039: exiting _queue_task() for managed_node1/copy 15627 1726882463.32058: done queuing things up, now waiting for results queue to drain 15627 1726882463.32059: waiting for pending results... 
15627 1726882463.32320: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 15627 1726882463.32432: in run() - task 0e448fcc-3ce9-2847-7723-000000000094 15627 1726882463.32450: variable 'ansible_search_path' from source: unknown 15627 1726882463.32457: variable 'ansible_search_path' from source: unknown 15627 1726882463.32509: calling self._execute() 15627 1726882463.32587: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.32608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.32625: variable 'omit' from source: magic vars 15627 1726882463.33102: variable 'ansible_distribution' from source: facts 15627 1726882463.33127: Evaluated conditional (ansible_distribution == 'CentOS'): True 15627 1726882463.33249: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.33271: Evaluated conditional (ansible_distribution_major_version == '6'): False 15627 1726882463.33279: when evaluation is False, skipping this task 15627 1726882463.33285: _execute() done 15627 1726882463.33290: dumping result to json 15627 1726882463.33296: done dumping result, returning 15627 1726882463.33304: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-2847-7723-000000000094] 15627 1726882463.33313: sending task result for task 0e448fcc-3ce9-2847-7723-000000000094 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15627 1726882463.33469: no more pending results, returning what we have 15627 1726882463.33472: results queue empty 15627 1726882463.33473: checking for any_errors_fatal 15627 1726882463.33480: done checking for any_errors_fatal 15627 1726882463.33481: checking for max_fail_percentage 15627 1726882463.33483: done checking for max_fail_percentage 15627 1726882463.33484: checking to see if all hosts have failed and the 
running result is not ok 15627 1726882463.33485: done checking to see if all hosts have failed 15627 1726882463.33485: getting the remaining hosts for this loop 15627 1726882463.33487: done getting the remaining hosts for this loop 15627 1726882463.33491: getting the next task for host managed_node1 15627 1726882463.33499: done getting next task for host managed_node1 15627 1726882463.33502: ^ task is: TASK: Include the task 'enable_epel.yml' 15627 1726882463.33505: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.33509: getting variables 15627 1726882463.33510: in VariableManager get_vars() 15627 1726882463.33541: Calling all_inventory to load vars for managed_node1 15627 1726882463.33544: Calling groups_inventory to load vars for managed_node1 15627 1726882463.33548: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.33562: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.33568: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.33571: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.33788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.33989: done with get_vars() 15627 1726882463.33999: done getting variables 15627 1726882463.34214: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000094 15627 1726882463.34217: WORKER PROCESS EXITING TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:34:23 -0400 (0:00:00.026) 0:00:03.093 ****** 15627 1726882463.34231: entering _queue_task() for managed_node1/include_tasks 15627 1726882463.34608: worker is 1 (out of 1 available) 15627 1726882463.34619: exiting _queue_task() for managed_node1/include_tasks 15627 1726882463.34630: done queuing things up, now waiting for results queue to drain 15627 1726882463.34631: waiting for pending results... 
15627 1726882463.34866: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 15627 1726882463.34975: in run() - task 0e448fcc-3ce9-2847-7723-000000000095 15627 1726882463.35000: variable 'ansible_search_path' from source: unknown 15627 1726882463.35006: variable 'ansible_search_path' from source: unknown 15627 1726882463.35043: calling self._execute() 15627 1726882463.35124: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.35135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.35148: variable 'omit' from source: magic vars 15627 1726882463.35657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882463.38173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882463.38243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882463.38294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882463.38331: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882463.38370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882463.38449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882463.38503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882463.38532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882463.38583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882463.38605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882463.38735: variable '__network_is_ostree' from source: set_fact 15627 1726882463.38756: Evaluated conditional (not __network_is_ostree | d(false)): True 15627 1726882463.38768: _execute() done 15627 1726882463.38775: dumping result to json 15627 1726882463.38781: done dumping result, returning 15627 1726882463.38790: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-2847-7723-000000000095] 15627 1726882463.38808: sending task result for task 0e448fcc-3ce9-2847-7723-000000000095 15627 1726882463.38937: no more pending results, returning what we have 15627 1726882463.38943: in VariableManager get_vars() 15627 1726882463.38978: Calling all_inventory to load vars for managed_node1 15627 1726882463.38981: Calling groups_inventory to load vars for managed_node1 15627 1726882463.38985: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.38995: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.38999: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.39002: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.39190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.39398: done with get_vars() 15627 1726882463.39406: variable 'ansible_search_path' from source: unknown 15627 
1726882463.39407: variable 'ansible_search_path' from source: unknown 15627 1726882463.39445: we have included files to process 15627 1726882463.39446: generating all_blocks data 15627 1726882463.39448: done generating all_blocks data 15627 1726882463.39454: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15627 1726882463.39455: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15627 1726882463.39458: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15627 1726882463.40054: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000095 15627 1726882463.40057: WORKER PROCESS EXITING 15627 1726882463.40576: done processing included file 15627 1726882463.40578: iterating over new_blocks loaded from include file 15627 1726882463.40580: in VariableManager get_vars() 15627 1726882463.40597: done with get_vars() 15627 1726882463.40599: filtering new block on tags 15627 1726882463.40627: done filtering new block on tags 15627 1726882463.40630: in VariableManager get_vars() 15627 1726882463.40642: done with get_vars() 15627 1726882463.40643: filtering new block on tags 15627 1726882463.40655: done filtering new block on tags 15627 1726882463.40656: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 15627 1726882463.40662: extending task lists for all hosts with included blocks 15627 1726882463.40776: done extending task lists 15627 1726882463.40777: done processing included files 15627 1726882463.40778: results queue empty 15627 1726882463.40779: checking for any_errors_fatal 15627 1726882463.40782: done checking for any_errors_fatal 15627 1726882463.40783: checking for max_fail_percentage 15627 
1726882463.40784: done checking for max_fail_percentage 15627 1726882463.40785: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.40786: done checking to see if all hosts have failed 15627 1726882463.40787: getting the remaining hosts for this loop 15627 1726882463.40788: done getting the remaining hosts for this loop 15627 1726882463.40791: getting the next task for host managed_node1 15627 1726882463.40795: done getting next task for host managed_node1 15627 1726882463.40797: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15627 1726882463.40800: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.40802: getting variables 15627 1726882463.40803: in VariableManager get_vars() 15627 1726882463.40817: Calling all_inventory to load vars for managed_node1 15627 1726882463.40823: Calling groups_inventory to load vars for managed_node1 15627 1726882463.40825: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.40831: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.40839: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.40842: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.41008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.41261: done with get_vars() 15627 1726882463.41271: done getting variables 15627 1726882463.41483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 15627 1726882463.41691: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:34:23 -0400 (0:00:00.075) 0:00:03.169 ****** 15627 1726882463.41737: entering _queue_task() for managed_node1/command 15627 1726882463.41739: Creating lock for command 15627 1726882463.42485: worker is 1 (out of 1 available) 15627 1726882463.42497: exiting _queue_task() for managed_node1/command 15627 1726882463.42508: done queuing things up, now waiting for results queue to drain 15627 1726882463.42509: waiting for pending results... 
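Annotation: the banner above reads `TASK [Create EPEL 9]` even though the task is named `Create EPEL {{ ansible_distribution_major_version }}`, because Ansible renders the task name through Jinja against the host's facts before printing it. A minimal stand-in for that rendering step (the regex-based `render_task_name` helper is hypothetical, not Ansible's actual templar):

```python
import re

def render_task_name(raw, facts):
    # Minimal stand-in for Ansible's Jinja templating of task names:
    # replace each {{ var }} placeholder with the matching host fact.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", lambda m: str(facts[m.group(1)]), raw)

# Fact value taken from the log: the banner rendered as "Create EPEL 9".
name = render_task_name(
    "Create EPEL {{ ansible_distribution_major_version }}",
    {"ansible_distribution_major_version": "9"},
)
print(name)  # Create EPEL 9
```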
15627 1726882463.43039: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 15627 1726882463.43147: in run() - task 0e448fcc-3ce9-2847-7723-0000000000af 15627 1726882463.43168: variable 'ansible_search_path' from source: unknown 15627 1726882463.43174: variable 'ansible_search_path' from source: unknown 15627 1726882463.43214: calling self._execute() 15627 1726882463.43297: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.43307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.43319: variable 'omit' from source: magic vars 15627 1726882463.43688: variable 'ansible_distribution' from source: facts 15627 1726882463.43702: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15627 1726882463.43834: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.43847: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15627 1726882463.43853: when evaluation is False, skipping this task 15627 1726882463.43861: _execute() done 15627 1726882463.43869: dumping result to json 15627 1726882463.43878: done dumping result, returning 15627 1726882463.43887: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-2847-7723-0000000000af] 15627 1726882463.43896: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000af skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15627 1726882463.44062: no more pending results, returning what we have 15627 1726882463.44068: results queue empty 15627 1726882463.44069: checking for any_errors_fatal 15627 1726882463.44070: done checking for any_errors_fatal 15627 1726882463.44071: checking for max_fail_percentage 15627 1726882463.44073: done checking for max_fail_percentage 15627 1726882463.44073: checking to see if all hosts have failed and 
the running result is not ok 15627 1726882463.44074: done checking to see if all hosts have failed 15627 1726882463.44075: getting the remaining hosts for this loop 15627 1726882463.44076: done getting the remaining hosts for this loop 15627 1726882463.44080: getting the next task for host managed_node1 15627 1726882463.44089: done getting next task for host managed_node1 15627 1726882463.44092: ^ task is: TASK: Install yum-utils package 15627 1726882463.44096: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.44100: getting variables 15627 1726882463.44101: in VariableManager get_vars() 15627 1726882463.44134: Calling all_inventory to load vars for managed_node1 15627 1726882463.44137: Calling groups_inventory to load vars for managed_node1 15627 1726882463.44141: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.44158: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.44162: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.44167: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.44365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.44626: done with get_vars() 15627 1726882463.44639: done getting variables 15627 1726882463.44827: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000af 15627 1726882463.44830: WORKER PROCESS EXITING 15627 1726882463.44977: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:34:23 -0400 (0:00:00.032) 0:00:03.201 ****** 15627 1726882463.45010: entering _queue_task() for managed_node1/package 15627 1726882463.45012: Creating lock for package 15627 1726882463.45409: worker is 1 (out of 1 available) 15627 1726882463.45420: exiting _queue_task() for managed_node1/package 15627 1726882463.45431: done queuing things up, now waiting for results queue to drain 15627 1726882463.45432: waiting for pending results... 
15627 1726882463.45685: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 15627 1726882463.45796: in run() - task 0e448fcc-3ce9-2847-7723-0000000000b0 15627 1726882463.45811: variable 'ansible_search_path' from source: unknown 15627 1726882463.45818: variable 'ansible_search_path' from source: unknown 15627 1726882463.45860: calling self._execute() 15627 1726882463.46021: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.46031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.46049: variable 'omit' from source: magic vars 15627 1726882463.46468: variable 'ansible_distribution' from source: facts 15627 1726882463.46489: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15627 1726882463.46642: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.46659: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15627 1726882463.46671: when evaluation is False, skipping this task 15627 1726882463.46678: _execute() done 15627 1726882463.46684: dumping result to json 15627 1726882463.46693: done dumping result, returning 15627 1726882463.46706: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-2847-7723-0000000000b0] 15627 1726882463.46716: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15627 1726882463.46920: no more pending results, returning what we have 15627 1726882463.46924: results queue empty 15627 1726882463.46925: checking for any_errors_fatal 15627 1726882463.46929: done checking for any_errors_fatal 15627 1726882463.46930: checking for max_fail_percentage 15627 1726882463.46932: done checking for max_fail_percentage 15627 1726882463.46933: checking to see if 
all hosts have failed and the running result is not ok 15627 1726882463.46934: done checking to see if all hosts have failed 15627 1726882463.46935: getting the remaining hosts for this loop 15627 1726882463.46936: done getting the remaining hosts for this loop 15627 1726882463.46940: getting the next task for host managed_node1 15627 1726882463.46948: done getting next task for host managed_node1 15627 1726882463.46951: ^ task is: TASK: Enable EPEL 7 15627 1726882463.46958: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.46961: getting variables 15627 1726882463.46962: in VariableManager get_vars() 15627 1726882463.46987: Calling all_inventory to load vars for managed_node1 15627 1726882463.46990: Calling groups_inventory to load vars for managed_node1 15627 1726882463.46995: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.47009: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.47012: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.47015: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.47183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.47400: done with get_vars() 15627 1726882463.47409: done getting variables 15627 1726882463.47578: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b0 15627 1726882463.47582: WORKER PROCESS EXITING 15627 1726882463.47619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:34:23 -0400 (0:00:00.026) 0:00:03.228 ****** 15627 1726882463.47684: entering _queue_task() for managed_node1/command 15627 1726882463.48094: worker is 1 (out of 1 available) 15627 1726882463.48105: exiting _queue_task() for managed_node1/command 15627 1726882463.48116: done queuing things up, now waiting for results queue to drain 15627 1726882463.48117: waiting for pending results... 
15627 1726882463.48371: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 15627 1726882463.48489: in run() - task 0e448fcc-3ce9-2847-7723-0000000000b1 15627 1726882463.48518: variable 'ansible_search_path' from source: unknown 15627 1726882463.48525: variable 'ansible_search_path' from source: unknown 15627 1726882463.48577: calling self._execute() 15627 1726882463.48660: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.48677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.48691: variable 'omit' from source: magic vars 15627 1726882463.50151: variable 'ansible_distribution' from source: facts 15627 1726882463.50257: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15627 1726882463.50451: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.50465: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15627 1726882463.50504: when evaluation is False, skipping this task 15627 1726882463.50513: _execute() done 15627 1726882463.50520: dumping result to json 15627 1726882463.50526: done dumping result, returning 15627 1726882463.50535: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-2847-7723-0000000000b1] 15627 1726882463.50549: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b1 15627 1726882463.50654: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b1 15627 1726882463.50669: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15627 1726882463.50724: no more pending results, returning what we have 15627 1726882463.50728: results queue empty 15627 1726882463.50728: checking for any_errors_fatal 15627 1726882463.50735: done checking for any_errors_fatal 15627 1726882463.50736: checking for 
max_fail_percentage 15627 1726882463.50738: done checking for max_fail_percentage 15627 1726882463.50739: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.50739: done checking to see if all hosts have failed 15627 1726882463.50740: getting the remaining hosts for this loop 15627 1726882463.50742: done getting the remaining hosts for this loop 15627 1726882463.50745: getting the next task for host managed_node1 15627 1726882463.50752: done getting next task for host managed_node1 15627 1726882463.50757: ^ task is: TASK: Enable EPEL 8 15627 1726882463.50761: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.50768: getting variables 15627 1726882463.50770: in VariableManager get_vars() 15627 1726882463.50799: Calling all_inventory to load vars for managed_node1 15627 1726882463.50802: Calling groups_inventory to load vars for managed_node1 15627 1726882463.50806: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.50819: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.50822: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.50825: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.51014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.51246: done with get_vars() 15627 1726882463.51258: done getting variables 15627 1726882463.51333: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:34:23 -0400 (0:00:00.037) 0:00:03.265 ****** 15627 1726882463.51374: entering _queue_task() for managed_node1/command 15627 1726882463.51750: worker is 1 (out of 1 available) 15627 1726882463.51767: exiting _queue_task() for managed_node1/command 15627 1726882463.51778: done queuing things up, now waiting for results queue to drain 15627 1726882463.51779: waiting for pending results... 
15627 1726882463.52009: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 15627 1726882463.52119: in run() - task 0e448fcc-3ce9-2847-7723-0000000000b2 15627 1726882463.52138: variable 'ansible_search_path' from source: unknown 15627 1726882463.52144: variable 'ansible_search_path' from source: unknown 15627 1726882463.52193: calling self._execute() 15627 1726882463.52274: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.52290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.52305: variable 'omit' from source: magic vars 15627 1726882463.52673: variable 'ansible_distribution' from source: facts 15627 1726882463.52689: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15627 1726882463.52820: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.52831: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15627 1726882463.52837: when evaluation is False, skipping this task 15627 1726882463.52844: _execute() done 15627 1726882463.52850: dumping result to json 15627 1726882463.52859: done dumping result, returning 15627 1726882463.52872: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-2847-7723-0000000000b2] 15627 1726882463.52881: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b2 15627 1726882463.52984: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b2 15627 1726882463.52992: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15627 1726882463.53037: no more pending results, returning what we have 15627 1726882463.53040: results queue empty 15627 1726882463.53041: checking for any_errors_fatal 15627 1726882463.53048: done checking for any_errors_fatal 15627 1726882463.53048: checking for 
max_fail_percentage 15627 1726882463.53050: done checking for max_fail_percentage 15627 1726882463.53051: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.53052: done checking to see if all hosts have failed 15627 1726882463.53053: getting the remaining hosts for this loop 15627 1726882463.53057: done getting the remaining hosts for this loop 15627 1726882463.53061: getting the next task for host managed_node1 15627 1726882463.53073: done getting next task for host managed_node1 15627 1726882463.53076: ^ task is: TASK: Enable EPEL 6 15627 1726882463.53080: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.53084: getting variables 15627 1726882463.53085: in VariableManager get_vars() 15627 1726882463.53113: Calling all_inventory to load vars for managed_node1 15627 1726882463.53116: Calling groups_inventory to load vars for managed_node1 15627 1726882463.53119: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.53133: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.53136: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.53139: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.53320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.53513: done with get_vars() 15627 1726882463.53523: done getting variables 15627 1726882463.53593: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:34:23 -0400 (0:00:00.022) 0:00:03.287 ****** 15627 1726882463.53625: entering _queue_task() for managed_node1/copy 15627 1726882463.54445: worker is 1 (out of 1 available) 15627 1726882463.54459: exiting _queue_task() for managed_node1/copy 15627 1726882463.54515: done queuing things up, now waiting for results queue to drain 15627 1726882463.54517: waiting for pending results... 
15627 1726882463.55266: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 15627 1726882463.55415: in run() - task 0e448fcc-3ce9-2847-7723-0000000000b4 15627 1726882463.55433: variable 'ansible_search_path' from source: unknown 15627 1726882463.55440: variable 'ansible_search_path' from source: unknown 15627 1726882463.55486: calling self._execute() 15627 1726882463.55577: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.55603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.55619: variable 'omit' from source: magic vars 15627 1726882463.56016: variable 'ansible_distribution' from source: facts 15627 1726882463.56038: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15627 1726882463.56162: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.56176: Evaluated conditional (ansible_distribution_major_version == '6'): False 15627 1726882463.56184: when evaluation is False, skipping this task 15627 1726882463.56192: _execute() done 15627 1726882463.56199: dumping result to json 15627 1726882463.56207: done dumping result, returning 15627 1726882463.56218: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-2847-7723-0000000000b4] 15627 1726882463.56229: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b4 15627 1726882463.56343: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000b4 15627 1726882463.56350: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15627 1726882463.56405: no more pending results, returning what we have 15627 1726882463.56409: results queue empty 15627 1726882463.56409: checking for any_errors_fatal 15627 1726882463.56415: done checking for any_errors_fatal 15627 1726882463.56416: checking for max_fail_percentage 
15627 1726882463.56418: done checking for max_fail_percentage 15627 1726882463.56418: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.56420: done checking to see if all hosts have failed 15627 1726882463.56420: getting the remaining hosts for this loop 15627 1726882463.56422: done getting the remaining hosts for this loop 15627 1726882463.56426: getting the next task for host managed_node1 15627 1726882463.56437: done getting next task for host managed_node1 15627 1726882463.56441: ^ task is: TASK: Set network provider to 'nm' 15627 1726882463.56444: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882463.56448: getting variables 15627 1726882463.56450: in VariableManager get_vars() 15627 1726882463.56485: Calling all_inventory to load vars for managed_node1 15627 1726882463.56488: Calling groups_inventory to load vars for managed_node1 15627 1726882463.56492: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.56505: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.56509: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.56512: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.56757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.56951: done with get_vars() 15627 1726882463.57149: done getting variables 15627 1726882463.57219: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 21:34:23 -0400 (0:00:00.037) 0:00:03.325 ****** 15627 1726882463.57362: entering _queue_task() for managed_node1/set_fact 15627 1726882463.57679: worker is 1 (out of 1 available) 15627 1726882463.57689: exiting _queue_task() for managed_node1/set_fact 15627 1726882463.57700: done queuing things up, now waiting for results queue to drain 15627 1726882463.57701: waiting for pending results... 15627 1726882463.58410: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 15627 1726882463.58510: in run() - task 0e448fcc-3ce9-2847-7723-000000000007 15627 1726882463.58528: variable 'ansible_search_path' from source: unknown 15627 1726882463.58572: calling self._execute() 15627 1726882463.58652: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.58670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.58690: variable 'omit' from source: magic vars 15627 1726882463.58818: variable 'omit' from source: magic vars 15627 1726882463.58851: variable 'omit' from source: magic vars 15627 1726882463.58894: variable 'omit' from source: magic vars 15627 1726882463.58943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882463.59002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882463.59038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882463.59067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.59085: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.59123: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882463.59133: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.59141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.59271: Set connection var ansible_timeout to 10 15627 1726882463.59285: Set connection var ansible_shell_executable to /bin/sh 15627 1726882463.59293: Set connection var ansible_connection to ssh 15627 1726882463.59301: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882463.59309: Set connection var ansible_pipelining to False 15627 1726882463.59314: Set connection var ansible_shell_type to sh 15627 1726882463.59340: variable 'ansible_shell_executable' from source: unknown 15627 1726882463.59352: variable 'ansible_connection' from source: unknown 15627 1726882463.59363: variable 'ansible_module_compression' from source: unknown 15627 1726882463.59372: variable 'ansible_shell_type' from source: unknown 15627 1726882463.59381: variable 'ansible_shell_executable' from source: unknown 15627 1726882463.59395: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.59403: variable 'ansible_pipelining' from source: unknown 15627 1726882463.59409: variable 'ansible_timeout' from source: unknown 15627 1726882463.59416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.59577: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882463.59594: variable 'omit' from source: magic vars 15627 1726882463.59612: starting 
attempt loop 15627 1726882463.59619: running the handler 15627 1726882463.59640: handler run complete 15627 1726882463.59668: attempt loop complete, returning result 15627 1726882463.59677: _execute() done 15627 1726882463.59685: dumping result to json 15627 1726882463.59691: done dumping result, returning 15627 1726882463.59702: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-2847-7723-000000000007] 15627 1726882463.59710: sending task result for task 0e448fcc-3ce9-2847-7723-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15627 1726882463.59851: no more pending results, returning what we have 15627 1726882463.59857: results queue empty 15627 1726882463.59858: checking for any_errors_fatal 15627 1726882463.59866: done checking for any_errors_fatal 15627 1726882463.59867: checking for max_fail_percentage 15627 1726882463.59869: done checking for max_fail_percentage 15627 1726882463.59870: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.59872: done checking to see if all hosts have failed 15627 1726882463.59872: getting the remaining hosts for this loop 15627 1726882463.59874: done getting the remaining hosts for this loop 15627 1726882463.59878: getting the next task for host managed_node1 15627 1726882463.59887: done getting next task for host managed_node1 15627 1726882463.59889: ^ task is: TASK: meta (flush_handlers) 15627 1726882463.59891: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.59895: getting variables 15627 1726882463.59896: in VariableManager get_vars() 15627 1726882463.59926: Calling all_inventory to load vars for managed_node1 15627 1726882463.59929: Calling groups_inventory to load vars for managed_node1 15627 1726882463.59932: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.59943: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.59947: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.59950: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.60134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.60337: done with get_vars() 15627 1726882463.60346: done getting variables 15627 1726882463.60424: in VariableManager get_vars() 15627 1726882463.60433: Calling all_inventory to load vars for managed_node1 15627 1726882463.60435: Calling groups_inventory to load vars for managed_node1 15627 1726882463.60437: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.60442: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.60444: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.60447: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.60807: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000007 15627 1726882463.60810: WORKER PROCESS EXITING 15627 1726882463.60840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.61123: done with get_vars() 15627 1726882463.61136: done queuing things up, now waiting for results queue to drain 15627 1726882463.61138: results queue empty 15627 1726882463.61139: checking for any_errors_fatal 15627 1726882463.61141: done checking for any_errors_fatal 15627 1726882463.61142: checking for max_fail_percentage 15627 
1726882463.61143: done checking for max_fail_percentage 15627 1726882463.61144: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.61144: done checking to see if all hosts have failed 15627 1726882463.61152: getting the remaining hosts for this loop 15627 1726882463.61156: done getting the remaining hosts for this loop 15627 1726882463.61159: getting the next task for host managed_node1 15627 1726882463.61163: done getting next task for host managed_node1 15627 1726882463.61166: ^ task is: TASK: meta (flush_handlers) 15627 1726882463.61168: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882463.61183: getting variables 15627 1726882463.61184: in VariableManager get_vars() 15627 1726882463.61192: Calling all_inventory to load vars for managed_node1 15627 1726882463.61194: Calling groups_inventory to load vars for managed_node1 15627 1726882463.61196: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.61206: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.61209: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.61212: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.61390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.61600: done with get_vars() 15627 1726882463.61608: done getting variables 15627 1726882463.61657: in VariableManager get_vars() 15627 1726882463.61668: Calling all_inventory to load vars for managed_node1 15627 1726882463.61670: Calling groups_inventory to load vars for managed_node1 15627 1726882463.61673: Calling all_plugins_inventory to load vars for managed_node1 15627 
1726882463.61684: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.61686: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.61689: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.61829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.62023: done with get_vars() 15627 1726882463.62035: done queuing things up, now waiting for results queue to drain 15627 1726882463.62037: results queue empty 15627 1726882463.62038: checking for any_errors_fatal 15627 1726882463.62039: done checking for any_errors_fatal 15627 1726882463.62040: checking for max_fail_percentage 15627 1726882463.62041: done checking for max_fail_percentage 15627 1726882463.62041: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.62042: done checking to see if all hosts have failed 15627 1726882463.62043: getting the remaining hosts for this loop 15627 1726882463.62044: done getting the remaining hosts for this loop 15627 1726882463.62046: getting the next task for host managed_node1 15627 1726882463.62049: done getting next task for host managed_node1 15627 1726882463.62050: ^ task is: None 15627 1726882463.62052: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.62053: done queuing things up, now waiting for results queue to drain 15627 1726882463.62056: results queue empty 15627 1726882463.62057: checking for any_errors_fatal 15627 1726882463.62058: done checking for any_errors_fatal 15627 1726882463.62068: checking for max_fail_percentage 15627 1726882463.62069: done checking for max_fail_percentage 15627 1726882463.62070: checking to see if all hosts have failed and the running result is not ok 15627 1726882463.62071: done checking to see if all hosts have failed 15627 1726882463.62073: getting the next task for host managed_node1 15627 1726882463.62075: done getting next task for host managed_node1 15627 1726882463.62076: ^ task is: None 15627 1726882463.62078: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.62134: in VariableManager get_vars() 15627 1726882463.62149: done with get_vars() 15627 1726882463.62167: in VariableManager get_vars() 15627 1726882463.62179: done with get_vars() 15627 1726882463.62183: variable 'omit' from source: magic vars 15627 1726882463.62215: in VariableManager get_vars() 15627 1726882463.62232: done with get_vars() 15627 1726882463.62267: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15627 1726882463.62590: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882463.62616: getting the remaining hosts for this loop 15627 1726882463.62649: done getting the remaining hosts for this loop 15627 1726882463.62653: getting the next task for host managed_node1 15627 1726882463.62658: done getting next task for host managed_node1 15627 1726882463.62661: ^ task is: TASK: Gathering Facts 15627 1726882463.62665: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882463.62667: getting variables 15627 1726882463.62668: in VariableManager get_vars() 15627 1726882463.62675: Calling all_inventory to load vars for managed_node1 15627 1726882463.62677: Calling groups_inventory to load vars for managed_node1 15627 1726882463.62679: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882463.62684: Calling all_plugins_play to load vars for managed_node1 15627 1726882463.62697: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882463.62700: Calling groups_plugins_play to load vars for managed_node1 15627 1726882463.63476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882463.63661: done with get_vars() 15627 1726882463.63996: done getting variables 15627 1726882463.64048: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Friday 20 September 2024 21:34:23 -0400 (0:00:00.067) 0:00:03.392 ****** 15627 1726882463.64108: entering _queue_task() for managed_node1/gather_facts 15627 1726882463.64522: worker is 1 (out of 1 available) 15627 1726882463.64532: exiting _queue_task() for managed_node1/gather_facts 15627 1726882463.64543: done queuing things up, now waiting for results queue to drain 15627 1726882463.64545: waiting for pending results... 
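[Editor's note: the `ok: [managed_node1] => {"ansible_facts": {"network_provider": "nm"}, "changed": false}` result logged above implies the task at `tests_bridge_nm.yml:13` is a plain `set_fact`. A hypothetical reconstruction of that task, for readers following the trace (the exact YAML is not shown in this log):]

```yaml
# Hypothetical sketch of the task at tests/network/tests_bridge_nm.yml:13,
# inferred from the logged result; names match the ansible_facts in the output.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

[`set_fact` runs entirely on the controller, which is why the trace shows "running the handler" and "handler run complete" with no `_low_level_execute_command()` SSH round-trips, unlike the `Gathering Facts` task that follows.]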
15627 1726882463.64789: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882463.64888: in run() - task 0e448fcc-3ce9-2847-7723-0000000000da 15627 1726882463.64908: variable 'ansible_search_path' from source: unknown 15627 1726882463.64948: calling self._execute() 15627 1726882463.65032: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.65044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.65073: variable 'omit' from source: magic vars 15627 1726882463.65481: variable 'ansible_distribution_major_version' from source: facts 15627 1726882463.65499: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882463.65509: variable 'omit' from source: magic vars 15627 1726882463.65542: variable 'omit' from source: magic vars 15627 1726882463.65585: variable 'omit' from source: magic vars 15627 1726882463.65627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882463.65675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882463.65714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882463.65742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.65769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882463.65803: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882463.65811: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.65818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.65922: Set connection var ansible_timeout to 10 15627 1726882463.65936: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882463.65946: Set connection var ansible_connection to ssh 15627 1726882463.65958: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882463.65975: Set connection var ansible_pipelining to False 15627 1726882463.65981: Set connection var ansible_shell_type to sh 15627 1726882463.66008: variable 'ansible_shell_executable' from source: unknown 15627 1726882463.66015: variable 'ansible_connection' from source: unknown 15627 1726882463.66021: variable 'ansible_module_compression' from source: unknown 15627 1726882463.66027: variable 'ansible_shell_type' from source: unknown 15627 1726882463.66032: variable 'ansible_shell_executable' from source: unknown 15627 1726882463.66037: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882463.66044: variable 'ansible_pipelining' from source: unknown 15627 1726882463.66050: variable 'ansible_timeout' from source: unknown 15627 1726882463.66060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882463.66271: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882463.66285: variable 'omit' from source: magic vars 15627 1726882463.66306: starting attempt loop 15627 1726882463.66313: running the handler 15627 1726882463.66340: variable 'ansible_facts' from source: unknown 15627 1726882463.66371: _low_level_execute_command(): starting 15627 1726882463.66384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882463.67295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882463.67311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882463.67327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.67346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.67393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.67408: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882463.67423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.67441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882463.67452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882463.67469: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882463.67481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.67494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.67508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.67523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.67535: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882463.67548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.67631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.67658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882463.67679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882463.67810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 
1726882463.70094: stdout chunk (state=3): >>>/root <<< 15627 1726882463.70244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882463.70298: stderr chunk (state=3): >>><<< 15627 1726882463.70301: stdout chunk (state=3): >>><<< 15627 1726882463.70332: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882463.70347: _low_level_execute_command(): starting 15627 1726882463.70356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464 `" && echo ansible-tmp-1726882463.7033346-15800-157094926833464="` echo /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464 `" ) && sleep 0' 15627 1726882463.70956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882463.70973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.70988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.71005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.71045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.71057: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882463.71078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.71095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882463.71106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882463.71116: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882463.71128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.71140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.71154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.71168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882463.71182: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882463.71198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.71272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.71297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882463.71315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882463.71446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882463.74341: stdout chunk (state=3): >>>ansible-tmp-1726882463.7033346-15800-157094926833464=/root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464 <<< 15627 1726882463.74535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882463.74622: stderr chunk (state=3): >>><<< 15627 1726882463.74633: stdout chunk (state=3): >>><<< 15627 1726882463.74774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882463.7033346-15800-157094926833464=/root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882463.74778: variable 'ansible_module_compression' from source: unknown 15627 1726882463.74781: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882463.74871: variable 'ansible_facts' from source: unknown 15627 1726882463.75050: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/AnsiballZ_setup.py 15627 1726882463.75192: Sending initial data 15627 1726882463.75202: Sent initial data (154 bytes) 15627 1726882463.75852: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.75859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.75894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.75898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.75900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.75952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.75961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882463.76066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882463.78485: stderr chunk (state=3): >>>debug2: 
Remote version: 3 <<< 15627 1726882463.78507: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15627 1726882463.78521: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15627 1726882463.78532: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15627 1726882463.78542: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 15627 1726882463.78551: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 15627 1726882463.78566: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15627 1726882463.78579: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882463.78596: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882463.78716: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882463.78811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpxyr5dt03 /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/AnsiballZ_setup.py <<< 15627 1726882463.78922: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882463.81329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882463.81431: stderr chunk (state=3): >>><<< 15627 1726882463.81434: stdout chunk (state=3): >>><<< 15627 1726882463.81451: done transferring module to remote 15627 1726882463.81467: _low_level_execute_command(): starting 15627 1726882463.81473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/ /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/AnsiballZ_setup.py && sleep 0' 15627 1726882463.81915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.81931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.81942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.81956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.82007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.82028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882463.82128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882463.84563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882463.84609: stderr chunk (state=3): >>><<< 15627 1726882463.84614: stdout chunk (state=3): >>><<< 15627 1726882463.84631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882463.84634: _low_level_execute_command(): starting 15627 1726882463.84638: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/AnsiballZ_setup.py && sleep 0' 15627 1726882463.85062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882463.85090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882463.85093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882463.85123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.85126: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882463.85129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882463.85191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882463.85197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882463.85199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882463.85305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882464.52747: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": 
"x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_num<<< 15627 1726882464.52812: stdout chunk (state=3): >>>ber": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "24", "epoch": "1726882464", "epoch_int": "1726882464", "date": "2024-09-20", "time": "21:34:24", "iso8601_micro": "2024-09-21T01:34:24.244197Z", "iso8601": "2024-09-21T01:34:24Z", "iso8601_basic": "20240920T213424244197", "iso8601_basic_short": "20240920T213424", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.54, "5m": 0.38, "15m": 0.19}, "ansible_lsb": {}, "ansible_processor": ["0", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2783, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 749, "free": 2783}, "nocache": {"free": 3244, "used": 288}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, 
"masters": {}}, "ansible_uptime_seconds": 622, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241164288, "block_size": 4096, "block_total": 65519355, "block_available": 64512003, "block_used": 1007352, "inode_total": 131071472, "inode_available": 130998695, "inode_used": 72777, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], 
"features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882464.55141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.44.90 closed. <<< 15627 1726882464.55144: stdout chunk (state=3): >>><<< 15627 1726882464.55146: stderr chunk (state=3): >>><<< 15627 1726882464.55376: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "24", "epoch": "1726882464", "epoch_int": "1726882464", "date": "2024-09-20", "time": "21:34:24", "iso8601_micro": "2024-09-21T01:34:24.244197Z", "iso8601": "2024-09-21T01:34:24Z", "iso8601_basic": "20240920T213424244197", "iso8601_basic_short": "20240920T213424", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.54, "5m": 0.38, "15m": 0.19}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2783, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 749, "free": 2783}, "nocache": {"free": 3244, "used": 288}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 622, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241164288, "block_size": 4096, "block_total": 65519355, "block_available": 64512003, "block_used": 1007352, "inode_total": 131071472, "inode_available": 130998695, "inode_used": 72777, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": 
"off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on 
[fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882464.55722: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882464.55747: _low_level_execute_command(): starting 15627 1726882464.55757: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882463.7033346-15800-157094926833464/ > /dev/null 2>&1 && sleep 0' 15627 1726882464.56437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882464.56451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882464.56479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.56534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.56589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.56638: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882464.56653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 
1726882464.56674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882464.56730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882464.56743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882464.56755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882464.56772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.57094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.57390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.57408: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882464.57984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882464.58061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882464.58093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882464.58111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882464.58336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882464.60750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882464.60822: stderr chunk (state=3): >>><<< 15627 1726882464.60827: stdout chunk (state=3): >>><<< 15627 1726882464.60974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882464.60978: handler run complete 15627 1726882464.61083: variable 'ansible_facts' from source: unknown 15627 1726882464.61090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.62022: variable 'ansible_facts' from source: unknown 15627 1726882464.62196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.62471: attempt loop complete, returning result 15627 1726882464.62554: _execute() done 15627 1726882464.62562: dumping result to json 15627 1726882464.62603: done dumping result, returning 15627 1726882464.62667: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-0000000000da] 15627 1726882464.62678: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000da ok: [managed_node1] 15627 1726882464.63712: no more pending results, returning what we have 15627 1726882464.63715: results queue empty 15627 1726882464.63716: checking for any_errors_fatal 15627 1726882464.63717: done checking for any_errors_fatal 15627 1726882464.63718: checking for max_fail_percentage 
15627 1726882464.63720: done checking for max_fail_percentage 15627 1726882464.63720: checking to see if all hosts have failed and the running result is not ok 15627 1726882464.63721: done checking to see if all hosts have failed 15627 1726882464.63722: getting the remaining hosts for this loop 15627 1726882464.63724: done getting the remaining hosts for this loop 15627 1726882464.63727: getting the next task for host managed_node1 15627 1726882464.63735: done getting next task for host managed_node1 15627 1726882464.63736: ^ task is: TASK: meta (flush_handlers) 15627 1726882464.63738: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882464.63742: getting variables 15627 1726882464.63743: in VariableManager get_vars() 15627 1726882464.63768: Calling all_inventory to load vars for managed_node1 15627 1726882464.63771: Calling groups_inventory to load vars for managed_node1 15627 1726882464.63781: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.63793: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.63796: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.63799: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.63957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.64276: done with get_vars() 15627 1726882464.64284: done getting variables 15627 1726882464.64540: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000da 15627 1726882464.64543: WORKER PROCESS EXITING 15627 1726882464.64588: in VariableManager get_vars() 15627 1726882464.64597: Calling all_inventory to load vars for managed_node1 15627 
1726882464.64599: Calling groups_inventory to load vars for managed_node1 15627 1726882464.64602: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.64606: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.64608: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.64615: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.65085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.65419: done with get_vars() 15627 1726882464.65429: done queuing things up, now waiting for results queue to drain 15627 1726882464.65431: results queue empty 15627 1726882464.65432: checking for any_errors_fatal 15627 1726882464.65434: done checking for any_errors_fatal 15627 1726882464.65435: checking for max_fail_percentage 15627 1726882464.65436: done checking for max_fail_percentage 15627 1726882464.65437: checking to see if all hosts have failed and the running result is not ok 15627 1726882464.65437: done checking to see if all hosts have failed 15627 1726882464.65438: getting the remaining hosts for this loop 15627 1726882464.65439: done getting the remaining hosts for this loop 15627 1726882464.65441: getting the next task for host managed_node1 15627 1726882464.65445: done getting next task for host managed_node1 15627 1726882464.65446: ^ task is: TASK: Set interface={{ interface }} 15627 1726882464.65448: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882464.65450: getting variables 15627 1726882464.65450: in VariableManager get_vars() 15627 1726882464.65457: Calling all_inventory to load vars for managed_node1 15627 1726882464.65458: Calling groups_inventory to load vars for managed_node1 15627 1726882464.65460: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.65624: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.65628: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.65631: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.66030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.66273: done with get_vars() 15627 1726882464.66308: done getting variables 15627 1726882464.66348: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15627 1726882464.66474: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 21:34:24 -0400 (0:00:01.024) 0:00:04.416 ****** 15627 1726882464.66522: entering _queue_task() for managed_node1/set_fact 15627 1726882464.66797: worker is 1 (out of 1 available) 15627 1726882464.66810: exiting _queue_task() for managed_node1/set_fact 15627 1726882464.66828: done queuing things up, now waiting for results queue to drain 15627 1726882464.66829: waiting for pending results... 
15627 1726882464.67081: running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 15627 1726882464.67177: in run() - task 0e448fcc-3ce9-2847-7723-00000000000b 15627 1726882464.67196: variable 'ansible_search_path' from source: unknown 15627 1726882464.67233: calling self._execute() 15627 1726882464.67322: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.67333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.67347: variable 'omit' from source: magic vars 15627 1726882464.67802: variable 'ansible_distribution_major_version' from source: facts 15627 1726882464.67826: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882464.67838: variable 'omit' from source: magic vars 15627 1726882464.67868: variable 'omit' from source: magic vars 15627 1726882464.67899: variable 'interface' from source: play vars 15627 1726882464.67984: variable 'interface' from source: play vars 15627 1726882464.68005: variable 'omit' from source: magic vars 15627 1726882464.68056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882464.68096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882464.68120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882464.68152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882464.68171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882464.68204: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882464.68212: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.68219: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 15627 1726882464.68330: Set connection var ansible_timeout to 10 15627 1726882464.68347: Set connection var ansible_shell_executable to /bin/sh 15627 1726882464.68370: Set connection var ansible_connection to ssh 15627 1726882464.68382: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882464.68394: Set connection var ansible_pipelining to False 15627 1726882464.68401: Set connection var ansible_shell_type to sh 15627 1726882464.68427: variable 'ansible_shell_executable' from source: unknown 15627 1726882464.68434: variable 'ansible_connection' from source: unknown 15627 1726882464.68440: variable 'ansible_module_compression' from source: unknown 15627 1726882464.68446: variable 'ansible_shell_type' from source: unknown 15627 1726882464.68452: variable 'ansible_shell_executable' from source: unknown 15627 1726882464.68464: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.68479: variable 'ansible_pipelining' from source: unknown 15627 1726882464.68485: variable 'ansible_timeout' from source: unknown 15627 1726882464.68492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.68631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882464.68646: variable 'omit' from source: magic vars 15627 1726882464.68655: starting attempt loop 15627 1726882464.68661: running the handler 15627 1726882464.68681: handler run complete 15627 1726882464.68701: attempt loop complete, returning result 15627 1726882464.68708: _execute() done 15627 1726882464.68714: dumping result to json 15627 1726882464.68720: done dumping result, returning 15627 1726882464.68730: done running 
TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 [0e448fcc-3ce9-2847-7723-00000000000b] 15627 1726882464.68739: sending task result for task 0e448fcc-3ce9-2847-7723-00000000000b ok: [managed_node1] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15627 1726882464.68879: no more pending results, returning what we have 15627 1726882464.68881: results queue empty 15627 1726882464.68882: checking for any_errors_fatal 15627 1726882464.68884: done checking for any_errors_fatal 15627 1726882464.68885: checking for max_fail_percentage 15627 1726882464.68887: done checking for max_fail_percentage 15627 1726882464.68887: checking to see if all hosts have failed and the running result is not ok 15627 1726882464.68889: done checking to see if all hosts have failed 15627 1726882464.68890: getting the remaining hosts for this loop 15627 1726882464.68891: done getting the remaining hosts for this loop 15627 1726882464.68895: getting the next task for host managed_node1 15627 1726882464.68902: done getting next task for host managed_node1 15627 1726882464.68905: ^ task is: TASK: Include the task 'show_interfaces.yml' 15627 1726882464.68907: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882464.68910: getting variables 15627 1726882464.68912: in VariableManager get_vars() 15627 1726882464.68941: Calling all_inventory to load vars for managed_node1 15627 1726882464.69007: Calling groups_inventory to load vars for managed_node1 15627 1726882464.69012: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.69022: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.69025: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.69028: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.69198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.69407: done with get_vars() 15627 1726882464.69416: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 21:34:24 -0400 (0:00:00.029) 0:00:04.446 ****** 15627 1726882464.69504: entering _queue_task() for managed_node1/include_tasks 15627 1726882464.69630: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000000b 15627 1726882464.69633: WORKER PROCESS EXITING 15627 1726882464.69968: worker is 1 (out of 1 available) 15627 1726882464.69980: exiting _queue_task() for managed_node1/include_tasks 15627 1726882464.69990: done queuing things up, now waiting for results queue to drain 15627 1726882464.69991: waiting for pending results... 
15627 1726882464.70231: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 15627 1726882464.70329: in run() - task 0e448fcc-3ce9-2847-7723-00000000000c 15627 1726882464.70346: variable 'ansible_search_path' from source: unknown 15627 1726882464.70384: calling self._execute() 15627 1726882464.70462: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.70475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.70487: variable 'omit' from source: magic vars 15627 1726882464.70847: variable 'ansible_distribution_major_version' from source: facts 15627 1726882464.70867: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882464.70879: _execute() done 15627 1726882464.70885: dumping result to json 15627 1726882464.70892: done dumping result, returning 15627 1726882464.70900: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-2847-7723-00000000000c] 15627 1726882464.70909: sending task result for task 0e448fcc-3ce9-2847-7723-00000000000c 15627 1726882464.71025: no more pending results, returning what we have 15627 1726882464.71029: in VariableManager get_vars() 15627 1726882464.71065: Calling all_inventory to load vars for managed_node1 15627 1726882464.71068: Calling groups_inventory to load vars for managed_node1 15627 1726882464.71071: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.71083: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.71086: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.71089: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.71268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.71467: done with get_vars() 15627 1726882464.71473: variable 'ansible_search_path' from 
source: unknown 15627 1726882464.71493: we have included files to process 15627 1726882464.71494: generating all_blocks data 15627 1726882464.71495: done generating all_blocks data 15627 1726882464.71496: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15627 1726882464.71497: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15627 1726882464.71499: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15627 1726882464.71814: in VariableManager get_vars() 15627 1726882464.71830: done with get_vars() 15627 1726882464.71919: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000000c 15627 1726882464.71922: WORKER PROCESS EXITING 15627 1726882464.72001: done processing included file 15627 1726882464.72003: iterating over new_blocks loaded from include file 15627 1726882464.72005: in VariableManager get_vars() 15627 1726882464.72015: done with get_vars() 15627 1726882464.72017: filtering new block on tags 15627 1726882464.72073: done filtering new block on tags 15627 1726882464.72075: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 15627 1726882464.72080: extending task lists for all hosts with included blocks 15627 1726882464.72145: done extending task lists 15627 1726882464.72146: done processing included files 15627 1726882464.72147: results queue empty 15627 1726882464.72148: checking for any_errors_fatal 15627 1726882464.72151: done checking for any_errors_fatal 15627 1726882464.72152: checking for max_fail_percentage 15627 1726882464.72153: done checking for max_fail_percentage 15627 1726882464.72154: checking to see if all hosts have failed 
and the running result is not ok 15627 1726882464.72155: done checking to see if all hosts have failed 15627 1726882464.72155: getting the remaining hosts for this loop 15627 1726882464.72156: done getting the remaining hosts for this loop 15627 1726882464.72159: getting the next task for host managed_node1 15627 1726882464.72162: done getting next task for host managed_node1 15627 1726882464.72166: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15627 1726882464.72169: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882464.72171: getting variables 15627 1726882464.72172: in VariableManager get_vars() 15627 1726882464.72179: Calling all_inventory to load vars for managed_node1 15627 1726882464.72181: Calling groups_inventory to load vars for managed_node1 15627 1726882464.72183: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.72188: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.72190: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.72192: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.72333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.72543: done with get_vars() 15627 1726882464.72551: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:34:24 -0400 (0:00:00.031) 0:00:04.477 ****** 15627 1726882464.72624: entering _queue_task() for managed_node1/include_tasks 15627 1726882464.72840: worker is 1 (out of 1 available) 15627 1726882464.72851: exiting _queue_task() for managed_node1/include_tasks 15627 1726882464.72861: done queuing things up, now waiting for results queue to drain 15627 1726882464.72862: waiting for pending results... 
15627 1726882464.73101: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 15627 1726882464.73202: in run() - task 0e448fcc-3ce9-2847-7723-0000000000ee 15627 1726882464.73219: variable 'ansible_search_path' from source: unknown 15627 1726882464.73229: variable 'ansible_search_path' from source: unknown 15627 1726882464.73271: calling self._execute() 15627 1726882464.73356: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.73369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.73383: variable 'omit' from source: magic vars 15627 1726882464.73759: variable 'ansible_distribution_major_version' from source: facts 15627 1726882464.73784: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882464.73794: _execute() done 15627 1726882464.73801: dumping result to json 15627 1726882464.73808: done dumping result, returning 15627 1726882464.73817: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-2847-7723-0000000000ee] 15627 1726882464.73827: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000ee 15627 1726882464.73931: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000ee 15627 1726882464.73938: WORKER PROCESS EXITING 15627 1726882464.73980: no more pending results, returning what we have 15627 1726882464.73985: in VariableManager get_vars() 15627 1726882464.74015: Calling all_inventory to load vars for managed_node1 15627 1726882464.74018: Calling groups_inventory to load vars for managed_node1 15627 1726882464.74022: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.74036: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.74039: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.74042: Calling groups_plugins_play to load vars for managed_node1 15627 
1726882464.74260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.74479: done with get_vars() 15627 1726882464.74492: variable 'ansible_search_path' from source: unknown 15627 1726882464.74494: variable 'ansible_search_path' from source: unknown 15627 1726882464.74531: we have included files to process 15627 1726882464.74532: generating all_blocks data 15627 1726882464.74534: done generating all_blocks data 15627 1726882464.74535: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15627 1726882464.74536: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15627 1726882464.74538: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15627 1726882464.75036: done processing included file 15627 1726882464.75038: iterating over new_blocks loaded from include file 15627 1726882464.75039: in VariableManager get_vars() 15627 1726882464.75051: done with get_vars() 15627 1726882464.75052: filtering new block on tags 15627 1726882464.75073: done filtering new block on tags 15627 1726882464.75075: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 15627 1726882464.75079: extending task lists for all hosts with included blocks 15627 1726882464.75181: done extending task lists 15627 1726882464.75183: done processing included files 15627 1726882464.75184: results queue empty 15627 1726882464.75184: checking for any_errors_fatal 15627 1726882464.75187: done checking for any_errors_fatal 15627 1726882464.75187: checking for max_fail_percentage 15627 1726882464.75188: done 
checking for max_fail_percentage 15627 1726882464.75189: checking to see if all hosts have failed and the running result is not ok 15627 1726882464.75190: done checking to see if all hosts have failed 15627 1726882464.75191: getting the remaining hosts for this loop 15627 1726882464.75192: done getting the remaining hosts for this loop 15627 1726882464.75194: getting the next task for host managed_node1 15627 1726882464.75198: done getting next task for host managed_node1 15627 1726882464.75200: ^ task is: TASK: Gather current interface info 15627 1726882464.75203: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882464.75205: getting variables 15627 1726882464.75205: in VariableManager get_vars() 15627 1726882464.75212: Calling all_inventory to load vars for managed_node1 15627 1726882464.75214: Calling groups_inventory to load vars for managed_node1 15627 1726882464.75217: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882464.75221: Calling all_plugins_play to load vars for managed_node1 15627 1726882464.75223: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882464.75226: Calling groups_plugins_play to load vars for managed_node1 15627 1726882464.75379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882464.75582: done with get_vars() 15627 1726882464.75593: done getting variables 15627 1726882464.75628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:34:24 -0400 (0:00:00.030) 0:00:04.508 ****** 15627 1726882464.75652: entering _queue_task() for managed_node1/command 15627 1726882464.75886: worker is 1 (out of 1 available) 15627 1726882464.75897: exiting _queue_task() for managed_node1/command 15627 1726882464.75914: done queuing things up, now waiting for results queue to drain 15627 1726882464.75915: waiting for pending results... 
15627 1726882464.76168: running TaskExecutor() for managed_node1/TASK: Gather current interface info 15627 1726882464.76278: in run() - task 0e448fcc-3ce9-2847-7723-0000000000fd 15627 1726882464.76295: variable 'ansible_search_path' from source: unknown 15627 1726882464.76302: variable 'ansible_search_path' from source: unknown 15627 1726882464.76338: calling self._execute() 15627 1726882464.76416: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.76426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.76439: variable 'omit' from source: magic vars 15627 1726882464.76858: variable 'ansible_distribution_major_version' from source: facts 15627 1726882464.76878: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882464.76900: variable 'omit' from source: magic vars 15627 1726882464.76943: variable 'omit' from source: magic vars 15627 1726882464.76982: variable 'omit' from source: magic vars 15627 1726882464.77029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882464.77071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882464.77093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882464.77124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882464.77138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882464.77173: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882464.77182: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.77189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 
1726882464.77292: Set connection var ansible_timeout to 10 15627 1726882464.77304: Set connection var ansible_shell_executable to /bin/sh 15627 1726882464.77312: Set connection var ansible_connection to ssh 15627 1726882464.77328: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882464.77340: Set connection var ansible_pipelining to False 15627 1726882464.77346: Set connection var ansible_shell_type to sh 15627 1726882464.77375: variable 'ansible_shell_executable' from source: unknown 15627 1726882464.77383: variable 'ansible_connection' from source: unknown 15627 1726882464.77390: variable 'ansible_module_compression' from source: unknown 15627 1726882464.77396: variable 'ansible_shell_type' from source: unknown 15627 1726882464.77402: variable 'ansible_shell_executable' from source: unknown 15627 1726882464.77408: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882464.77416: variable 'ansible_pipelining' from source: unknown 15627 1726882464.77423: variable 'ansible_timeout' from source: unknown 15627 1726882464.77439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882464.77591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882464.77606: variable 'omit' from source: magic vars 15627 1726882464.77616: starting attempt loop 15627 1726882464.77623: running the handler 15627 1726882464.77640: _low_level_execute_command(): starting 15627 1726882464.77668: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882464.79117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882464.79132: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15627 1726882464.79148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.79180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.79220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.79233: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882464.79247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882464.79281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882464.79295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882464.79307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882464.79320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882464.79334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.79350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.79368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.79389: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882464.79405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882464.79489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882464.79510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882464.79524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882464.79721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 15627 1726882464.81262: stdout chunk (state=3): >>>/root <<< 15627 1726882464.81385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882464.81459: stderr chunk (state=3): >>><<< 15627 1726882464.81463: stdout chunk (state=3): >>><<< 15627 1726882464.81584: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882464.81588: _low_level_execute_command(): starting 15627 1726882464.81592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904 `" && echo ansible-tmp-1726882464.8148892-15857-106182998836904="` echo /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904 `" ) && sleep 0' 15627 1726882464.82378: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882464.82386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882464.82397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.82410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.82447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.82456: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882464.82465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882464.82481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882464.82484: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882464.82492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882464.82499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882464.82508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882464.82520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882464.82528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882464.82535: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882464.82544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882464.82617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882464.82630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882464.82641: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15627 1726882464.82773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882464.85000: stdout chunk (state=3): >>>ansible-tmp-1726882464.8148892-15857-106182998836904=/root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904 <<< 15627 1726882464.85173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882464.85177: stderr chunk (state=3): >>><<< 15627 1726882464.85179: stdout chunk (state=3): >>><<< 15627 1726882464.85195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882464.8148892-15857-106182998836904=/root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882464.85227: variable 'ansible_module_compression' from source: unknown 15627 1726882464.85282: 
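The `( umask 77 && mkdir -p ... )` one-liner whose result is echoed above is Ansible's standard remote temp-directory handshake: `umask 77` makes the directory mode 0700 (private to the remote user), and the trailing `echo name=path` lets the controller parse the resolved remote path out of stdout. A standalone sketch of the same pattern, using a local throwaway path instead of `~/.ansible/tmp`:

```shell
# Create a private working directory the way Ansible's sh shell plugin does:
# umask 77 -> the new directory gets mode 0700, then echo the resolved path
# back so the caller can parse it from stdout.
tmpdir="${TMPDIR:-/tmp}/ansible-tmp-demo"
rm -rf "$tmpdir"
( umask 77 && mkdir -p "$tmpdir" && echo "ansible-tmp-demo=$tmpdir" )
# Inspect the resulting permissions (GNU stat first, BSD stat as fallback).
perms=$(stat -c '%a' "$tmpdir" 2>/dev/null || stat -f '%Lp' "$tmpdir")
```

The subshell matters: the `umask` change is scoped to the parentheses and does not leak into the rest of the session.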
ANSIBALLZ: Using generic lock for ansible.legacy.command 15627 1726882464.85285: ANSIBALLZ: Acquiring lock 15627 1726882464.85290: ANSIBALLZ: Lock acquired: 140251854220672 15627 1726882464.85292: ANSIBALLZ: Creating module 15627 1726882465.01242: ANSIBALLZ: Writing module into payload 15627 1726882465.01325: ANSIBALLZ: Writing module 15627 1726882465.01343: ANSIBALLZ: Renaming module 15627 1726882465.01348: ANSIBALLZ: Done creating module 15627 1726882465.01366: variable 'ansible_facts' from source: unknown 15627 1726882465.01410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/AnsiballZ_command.py 15627 1726882465.01513: Sending initial data 15627 1726882465.01516: Sent initial data (156 bytes) 15627 1726882465.02221: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.02224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.02227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.02230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.02269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.02272: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882465.02386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.02389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882465.02391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882465.02393: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882465.02395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.02397: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.02399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.02401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.02403: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882465.02405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.02444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.02466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.02478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.02616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882465.05126: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882465.05222: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882465.05322: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpy6fm8h4k /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/AnsiballZ_command.py <<< 15627 1726882465.05705: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882465.07219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.07340: stderr chunk (state=3): >>><<< 15627 1726882465.07343: stdout chunk (state=3): >>><<< 15627 1726882465.07346: done transferring module to remote 15627 1726882465.07348: _low_level_execute_command(): starting 15627 1726882465.07350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/ /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/AnsiballZ_command.py && sleep 0' 15627 1726882465.09121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.09124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.09166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882465.09169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.09172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882465.09175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.09240: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 15627 1726882465.09245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.09248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.09357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882465.11784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.11857: stderr chunk (state=3): >>><<< 15627 1726882465.11860: stdout chunk (state=3): >>><<< 15627 1726882465.11959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15627 1726882465.11962: _low_level_execute_command(): starting 15627 1726882465.11967: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/AnsiballZ_command.py 
&& sleep 0' 15627 1726882465.13388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.13391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.13433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882465.13436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.13438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882465.13440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.13623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.13627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.13629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.13746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15627 1726882465.30916: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:25.303239", "end": "2024-09-20 21:34:25.307638", "delta": "0:00:00.004399", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882465.32296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882465.32363: stderr chunk (state=3): >>><<< 15627 1726882465.32370: stdout chunk (state=3): >>><<< 15627 1726882465.32510: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:25.303239", "end": "2024-09-20 21:34:25.307638", "delta": "0:00:00.004399", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882465.32522: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882465.32525: _low_level_execute_command(): starting 15627 1726882465.32527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882464.8148892-15857-106182998836904/ > /dev/null 2>&1 && sleep 0' 15627 1726882465.35159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.35163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.35202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882465.35205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.35207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882465.35210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.35275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.35697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.35788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.37650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.37653: stdout chunk (state=3): >>><<< 15627 1726882465.37656: stderr chunk (state=3): >>><<< 15627 1726882465.37974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882465.37978: handler run complete 15627 1726882465.37980: Evaluated conditional (False): False 15627 1726882465.37982: attempt loop complete, returning result 15627 1726882465.37984: _execute() done 15627 1726882465.37986: dumping result to json 15627 1726882465.37988: done dumping result, returning 15627 1726882465.37990: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-2847-7723-0000000000fd] 15627 1726882465.37992: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000fd 15627 1726882465.38060: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000fd 15627 1726882465.38066: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004399", "end": "2024-09-20 21:34:25.307638", "rc": 0, "start": "2024-09-20 21:34:25.303239" } STDOUT: bonding_masters eth0 lo 15627 1726882465.38216: no more pending results, returning what we have 15627 1726882465.38219: results queue empty 15627 1726882465.38220: checking for any_errors_fatal 15627 1726882465.38221: done checking for any_errors_fatal 15627 1726882465.38222: checking for max_fail_percentage 15627 1726882465.38224: done checking for max_fail_percentage 15627 1726882465.38225: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.38226: done checking to see if all hosts have failed 15627 1726882465.38226: getting the remaining hosts for this loop 15627 1726882465.38228: done getting the remaining hosts for this loop 15627 1726882465.38231: getting the next task for host managed_node1 15627 1726882465.38238: done getting next task for host managed_node1 15627 1726882465.38241: ^ task is: 
TASK: Set current_interfaces 15627 1726882465.38245: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882465.38247: getting variables 15627 1726882465.38249: in VariableManager get_vars() 15627 1726882465.38276: Calling all_inventory to load vars for managed_node1 15627 1726882465.38278: Calling groups_inventory to load vars for managed_node1 15627 1726882465.38282: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.38292: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.38295: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.38298: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.38453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.38739: done with get_vars() 15627 1726882465.38749: done getting variables 15627 1726882465.38893: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:34:25 -0400 (0:00:00.632) 0:00:05.140 ****** 15627 1726882465.38932: entering _queue_task() for managed_node1/set_fact 15627 1726882465.39472: worker is 1 (out of 1 available) 15627 1726882465.39484: exiting _queue_task() for managed_node1/set_fact 15627 1726882465.39494: done queuing things up, now waiting for results queue to drain 15627 1726882465.39496: waiting for pending results... 15627 1726882465.40088: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 15627 1726882465.40203: in run() - task 0e448fcc-3ce9-2847-7723-0000000000fe 15627 1726882465.40230: variable 'ansible_search_path' from source: unknown 15627 1726882465.40238: variable 'ansible_search_path' from source: unknown 15627 1726882465.40281: calling self._execute() 15627 1726882465.40369: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.40380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.40394: variable 'omit' from source: magic vars 15627 1726882465.40755: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.40782: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.40793: variable 'omit' from source: magic vars 15627 1726882465.40842: variable 'omit' from source: magic vars 15627 1726882465.40955: variable '_current_interfaces' from source: set_fact 15627 1726882465.41033: variable 'omit' from source: magic vars 15627 1726882465.41082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 
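From the module args logged above (`chdir: /sys/class/net`, `_raw_params: ls -1`) and the registered variable `_current_interfaces` that feeds the set_fact task, the two tasks in get_current_interfaces.yml presumably look like the following. This is a reconstruction from the trace, not a quote of the file; the `stdout_lines` expression in particular is inferred:

```yaml
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```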
1726882465.41133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882465.41186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882465.41217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.41234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.41270: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882465.41279: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.41287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.41430: Set connection var ansible_timeout to 10 15627 1726882465.41445: Set connection var ansible_shell_executable to /bin/sh 15627 1726882465.41454: Set connection var ansible_connection to ssh 15627 1726882465.41465: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882465.41477: Set connection var ansible_pipelining to False 15627 1726882465.41485: Set connection var ansible_shell_type to sh 15627 1726882465.41512: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.41521: variable 'ansible_connection' from source: unknown 15627 1726882465.41538: variable 'ansible_module_compression' from source: unknown 15627 1726882465.41546: variable 'ansible_shell_type' from source: unknown 15627 1726882465.41554: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.41561: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.41570: variable 'ansible_pipelining' from source: unknown 15627 1726882465.41577: variable 'ansible_timeout' from source: unknown 15627 1726882465.41584: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 15627 1726882465.41735: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882465.41764: variable 'omit' from source: magic vars 15627 1726882465.41776: starting attempt loop 15627 1726882465.41784: running the handler 15627 1726882465.41799: handler run complete 15627 1726882465.41814: attempt loop complete, returning result 15627 1726882465.41821: _execute() done 15627 1726882465.41828: dumping result to json 15627 1726882465.41835: done dumping result, returning 15627 1726882465.41847: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-2847-7723-0000000000fe] 15627 1726882465.41869: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000fe ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15627 1726882465.42021: no more pending results, returning what we have 15627 1726882465.42025: results queue empty 15627 1726882465.42026: checking for any_errors_fatal 15627 1726882465.42033: done checking for any_errors_fatal 15627 1726882465.42034: checking for max_fail_percentage 15627 1726882465.42036: done checking for max_fail_percentage 15627 1726882465.42036: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.42038: done checking to see if all hosts have failed 15627 1726882465.42038: getting the remaining hosts for this loop 15627 1726882465.42040: done getting the remaining hosts for this loop 15627 1726882465.42044: getting the next task for host managed_node1 15627 1726882465.42053: done getting next task for host managed_node1 15627 1726882465.42056: ^ task is: TASK: Show current_interfaces 15627 
1726882465.42059: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882465.42065: getting variables 15627 1726882465.42067: in VariableManager get_vars() 15627 1726882465.42098: Calling all_inventory to load vars for managed_node1 15627 1726882465.42102: Calling groups_inventory to load vars for managed_node1 15627 1726882465.42106: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.42117: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.42120: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.42123: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.42329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.42576: done with get_vars() 15627 1726882465.42597: done getting variables 15627 1726882465.42727: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000fe 15627 1726882465.42731: WORKER PROCESS EXITING 15627 1726882465.42794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:34:25 -0400 (0:00:00.038) 0:00:05.179 ****** 15627 1726882465.42946: entering _queue_task() for managed_node1/debug 15627 1726882465.42948: Creating lock for debug 15627 1726882465.43439: worker is 1 (out of 1 available) 15627 1726882465.43451: exiting _queue_task() for managed_node1/debug 15627 1726882465.43465: done queuing things up, now waiting for results queue to drain 15627 1726882465.43466: waiting for pending results... 15627 1726882465.44516: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 15627 1726882465.44723: in run() - task 0e448fcc-3ce9-2847-7723-0000000000ef 15627 1726882465.44739: variable 'ansible_search_path' from source: unknown 15627 1726882465.44746: variable 'ansible_search_path' from source: unknown 15627 1726882465.44788: calling self._execute() 15627 1726882465.44886: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.44978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.45030: variable 'omit' from source: magic vars 15627 1726882465.45802: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.45907: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.45920: variable 'omit' from source: magic vars 15627 1726882465.45960: variable 'omit' from source: magic vars 15627 1726882465.46177: variable 'current_interfaces' from source: set_fact 15627 1726882465.46208: variable 'omit' from source: magic vars 15627 1726882465.46370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882465.46410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882465.46443: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882465.46519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.46539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.46751: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882465.46781: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.46793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.47013: Set connection var ansible_timeout to 10 15627 1726882465.47025: Set connection var ansible_shell_executable to /bin/sh 15627 1726882465.47033: Set connection var ansible_connection to ssh 15627 1726882465.47041: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882465.47049: Set connection var ansible_pipelining to False 15627 1726882465.47054: Set connection var ansible_shell_type to sh 15627 1726882465.47086: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.47175: variable 'ansible_connection' from source: unknown 15627 1726882465.47183: variable 'ansible_module_compression' from source: unknown 15627 1726882465.47190: variable 'ansible_shell_type' from source: unknown 15627 1726882465.47205: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.47217: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.47225: variable 'ansible_pipelining' from source: unknown 15627 1726882465.47231: variable 'ansible_timeout' from source: unknown 15627 1726882465.47239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.47565: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882465.47584: variable 'omit' from source: magic vars 15627 1726882465.47595: starting attempt loop 15627 1726882465.47602: running the handler 15627 1726882465.47775: handler run complete 15627 1726882465.47794: attempt loop complete, returning result 15627 1726882465.47802: _execute() done 15627 1726882465.47808: dumping result to json 15627 1726882465.47815: done dumping result, returning 15627 1726882465.47827: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-2847-7723-0000000000ef] 15627 1726882465.47835: sending task result for task 0e448fcc-3ce9-2847-7723-0000000000ef ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15627 1726882465.48016: no more pending results, returning what we have 15627 1726882465.48019: results queue empty 15627 1726882465.48020: checking for any_errors_fatal 15627 1726882465.48024: done checking for any_errors_fatal 15627 1726882465.48025: checking for max_fail_percentage 15627 1726882465.48027: done checking for max_fail_percentage 15627 1726882465.48028: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.48029: done checking to see if all hosts have failed 15627 1726882465.48030: getting the remaining hosts for this loop 15627 1726882465.48031: done getting the remaining hosts for this loop 15627 1726882465.48035: getting the next task for host managed_node1 15627 1726882465.48044: done getting next task for host managed_node1 15627 1726882465.48047: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15627 1726882465.48049: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882465.48052: getting variables 15627 1726882465.48053: in VariableManager get_vars() 15627 1726882465.48084: Calling all_inventory to load vars for managed_node1 15627 1726882465.48087: Calling groups_inventory to load vars for managed_node1 15627 1726882465.48091: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.48101: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.48104: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.48106: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.48293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.48488: done with get_vars() 15627 1726882465.48503: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 21:34:25 -0400 (0:00:00.056) 0:00:05.237 ****** 15627 1726882465.48621: entering _queue_task() for managed_node1/include_tasks 15627 1726882465.48639: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000000ef 15627 1726882465.48657: WORKER PROCESS EXITING 15627 1726882465.49362: worker is 1 (out of 1 available) 15627 1726882465.49492: exiting _queue_task() for managed_node1/include_tasks 15627 1726882465.49503: done queuing things up, now waiting for results queue to drain 15627 1726882465.49504: waiting for pending results... 
15627 1726882465.50353: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 15627 1726882465.50652: in run() - task 0e448fcc-3ce9-2847-7723-00000000000d 15627 1726882465.50687: variable 'ansible_search_path' from source: unknown 15627 1726882465.50733: calling self._execute() 15627 1726882465.50947: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.50960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.50975: variable 'omit' from source: magic vars 15627 1726882465.51797: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.51815: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.51826: _execute() done 15627 1726882465.51833: dumping result to json 15627 1726882465.51841: done dumping result, returning 15627 1726882465.51850: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-2847-7723-00000000000d] 15627 1726882465.51866: sending task result for task 0e448fcc-3ce9-2847-7723-00000000000d 15627 1726882465.52033: no more pending results, returning what we have 15627 1726882465.52038: in VariableManager get_vars() 15627 1726882465.52080: Calling all_inventory to load vars for managed_node1 15627 1726882465.52084: Calling groups_inventory to load vars for managed_node1 15627 1726882465.52088: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.52105: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.52109: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.52112: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.52346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.52542: done with get_vars() 15627 1726882465.52548: variable 'ansible_search_path' 
from source: unknown 15627 1726882465.52568: we have included files to process 15627 1726882465.52570: generating all_blocks data 15627 1726882465.52572: done generating all_blocks data 15627 1726882465.52578: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882465.52579: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882465.52582: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882465.53099: in VariableManager get_vars() 15627 1726882465.53116: done with get_vars() 15627 1726882465.53270: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000000d 15627 1726882465.53274: WORKER PROCESS EXITING 15627 1726882465.53674: done processing included file 15627 1726882465.53677: iterating over new_blocks loaded from include file 15627 1726882465.53679: in VariableManager get_vars() 15627 1726882465.53690: done with get_vars() 15627 1726882465.53691: filtering new block on tags 15627 1726882465.53708: done filtering new block on tags 15627 1726882465.53710: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15627 1726882465.53715: extending task lists for all hosts with included blocks 15627 1726882465.53860: done extending task lists 15627 1726882465.53861: done processing included files 15627 1726882465.53862: results queue empty 15627 1726882465.53863: checking for any_errors_fatal 15627 1726882465.53971: done checking for any_errors_fatal 15627 1726882465.53973: checking for max_fail_percentage 15627 1726882465.53974: done checking for max_fail_percentage 15627 1726882465.53975: checking to see 
if all hosts have failed and the running result is not ok 15627 1726882465.53976: done checking to see if all hosts have failed 15627 1726882465.53976: getting the remaining hosts for this loop 15627 1726882465.53978: done getting the remaining hosts for this loop 15627 1726882465.53981: getting the next task for host managed_node1 15627 1726882465.53985: done getting next task for host managed_node1 15627 1726882465.53987: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15627 1726882465.53990: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882465.53992: getting variables 15627 1726882465.53993: in VariableManager get_vars() 15627 1726882465.54001: Calling all_inventory to load vars for managed_node1 15627 1726882465.54003: Calling groups_inventory to load vars for managed_node1 15627 1726882465.54006: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.54011: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.54013: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.54016: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.54154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.54658: done with get_vars() 15627 1726882465.54677: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:34:25 -0400 (0:00:00.061) 0:00:05.299 ****** 15627 1726882465.54767: entering _queue_task() for managed_node1/include_tasks 15627 1726882465.55026: worker is 1 (out of 1 available) 15627 1726882465.55037: exiting _queue_task() for managed_node1/include_tasks 15627 1726882465.55048: done queuing things up, now waiting for results queue to drain 15627 1726882465.55049: waiting for pending results... 
15627 1726882465.55687: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15627 1726882465.55797: in run() - task 0e448fcc-3ce9-2847-7723-000000000119 15627 1726882465.55817: variable 'ansible_search_path' from source: unknown 15627 1726882465.55821: variable 'ansible_search_path' from source: unknown 15627 1726882465.55862: calling self._execute() 15627 1726882465.55942: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.55960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.55978: variable 'omit' from source: magic vars 15627 1726882465.56424: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.56440: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.56450: _execute() done 15627 1726882465.56460: dumping result to json 15627 1726882465.56472: done dumping result, returning 15627 1726882465.56483: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-2847-7723-000000000119] 15627 1726882465.56498: sending task result for task 0e448fcc-3ce9-2847-7723-000000000119 15627 1726882465.56662: no more pending results, returning what we have 15627 1726882465.56672: in VariableManager get_vars() 15627 1726882465.56705: Calling all_inventory to load vars for managed_node1 15627 1726882465.56708: Calling groups_inventory to load vars for managed_node1 15627 1726882465.56712: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.56726: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.56729: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.56732: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.56937: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000119 15627 1726882465.56941: WORKER PROCESS EXITING 15627 
1726882465.56984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.57175: done with get_vars() 15627 1726882465.57182: variable 'ansible_search_path' from source: unknown 15627 1726882465.57183: variable 'ansible_search_path' from source: unknown 15627 1726882465.57219: we have included files to process 15627 1726882465.57220: generating all_blocks data 15627 1726882465.57222: done generating all_blocks data 15627 1726882465.57223: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882465.57224: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882465.57225: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882465.57437: done processing included file 15627 1726882465.57439: iterating over new_blocks loaded from include file 15627 1726882465.57440: in VariableManager get_vars() 15627 1726882465.57454: done with get_vars() 15627 1726882465.57455: filtering new block on tags 15627 1726882465.57472: done filtering new block on tags 15627 1726882465.57474: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15627 1726882465.57481: extending task lists for all hosts with included blocks 15627 1726882465.57627: done extending task lists 15627 1726882465.57629: done processing included files 15627 1726882465.57629: results queue empty 15627 1726882465.57630: checking for any_errors_fatal 15627 1726882465.57632: done checking for any_errors_fatal 15627 1726882465.57633: checking for max_fail_percentage 15627 1726882465.57634: done checking for 
max_fail_percentage 15627 1726882465.57635: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.57636: done checking to see if all hosts have failed 15627 1726882465.57637: getting the remaining hosts for this loop 15627 1726882465.57638: done getting the remaining hosts for this loop 15627 1726882465.57640: getting the next task for host managed_node1 15627 1726882465.57644: done getting next task for host managed_node1 15627 1726882465.57646: ^ task is: TASK: Get stat for interface {{ interface }} 15627 1726882465.57649: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882465.57651: getting variables 15627 1726882465.57652: in VariableManager get_vars() 15627 1726882465.57660: Calling all_inventory to load vars for managed_node1 15627 1726882465.57662: Calling groups_inventory to load vars for managed_node1 15627 1726882465.57667: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.57672: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.57674: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.57676: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.57817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.58030: done with get_vars() 15627 1726882465.58036: done getting variables 15627 1726882465.58167: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:34:25 -0400 (0:00:00.034) 0:00:05.333 ****** 15627 1726882465.58218: entering _queue_task() for managed_node1/stat 15627 1726882465.58468: worker is 1 (out of 1 available) 15627 1726882465.58481: exiting _queue_task() for managed_node1/stat 15627 1726882465.58492: done queuing things up, now waiting for results queue to drain 15627 1726882465.58494: waiting for pending results... 
15627 1726882465.58750: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15627 1726882465.58865: in run() - task 0e448fcc-3ce9-2847-7723-000000000133 15627 1726882465.58882: variable 'ansible_search_path' from source: unknown 15627 1726882465.58889: variable 'ansible_search_path' from source: unknown 15627 1726882465.58931: calling self._execute() 15627 1726882465.59012: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.59026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.59069: variable 'omit' from source: magic vars 15627 1726882465.59451: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.59472: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.59480: variable 'omit' from source: magic vars 15627 1726882465.59519: variable 'omit' from source: magic vars 15627 1726882465.59592: variable 'interface' from source: set_fact 15627 1726882465.59610: variable 'omit' from source: magic vars 15627 1726882465.59639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882465.59667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882465.59684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882465.59697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.59709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.59732: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882465.59735: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.59737: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.59805: Set connection var ansible_timeout to 10 15627 1726882465.59817: Set connection var ansible_shell_executable to /bin/sh 15627 1726882465.59820: Set connection var ansible_connection to ssh 15627 1726882465.59822: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882465.59826: Set connection var ansible_pipelining to False 15627 1726882465.59829: Set connection var ansible_shell_type to sh 15627 1726882465.59846: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.59850: variable 'ansible_connection' from source: unknown 15627 1726882465.59852: variable 'ansible_module_compression' from source: unknown 15627 1726882465.59857: variable 'ansible_shell_type' from source: unknown 15627 1726882465.59859: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.59862: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.59867: variable 'ansible_pipelining' from source: unknown 15627 1726882465.59869: variable 'ansible_timeout' from source: unknown 15627 1726882465.59871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.60009: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882465.60016: variable 'omit' from source: magic vars 15627 1726882465.60021: starting attempt loop 15627 1726882465.60024: running the handler 15627 1726882465.60036: _low_level_execute_command(): starting 15627 1726882465.60043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882465.60533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.60543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.60585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.60595: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.60630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.60643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.60751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.62433: stdout chunk (state=3): >>>/root <<< 15627 1726882465.62593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.62604: stdout chunk (state=3): >>><<< 15627 1726882465.62622: stderr chunk (state=3): >>><<< 15627 1726882465.62645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882465.62670: _low_level_execute_command(): starting 15627 1726882465.62688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441 `" && echo ansible-tmp-1726882465.62651-15908-246236285984441="` echo /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441 `" ) && sleep 0' 15627 1726882465.63343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.63360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.63382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.63402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.63440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.63469: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882465.63505: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.63526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882465.63551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882465.63589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882465.63612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.63625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.63641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.63657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.63688: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882465.63714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.63768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.63802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.63806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.63924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.65840: stdout chunk (state=3): >>>ansible-tmp-1726882465.62651-15908-246236285984441=/root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441 <<< 15627 1726882465.65960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.66034: stderr chunk (state=3): >>><<< 15627 1726882465.66045: stdout chunk (state=3): >>><<< 15627 1726882465.66280: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882465.62651-15908-246236285984441=/root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882465.66283: variable 'ansible_module_compression' from source: unknown 15627 1726882465.66285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15627 1726882465.66287: variable 'ansible_facts' from source: unknown 15627 1726882465.66324: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/AnsiballZ_stat.py 15627 1726882465.66482: Sending initial data 15627 1726882465.66490: Sent initial data (151 bytes) 15627 1726882465.67672: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.67675: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15627 1726882465.67678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.67680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.67682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.67684: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882465.67686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.67688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882465.67690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882465.67692: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882465.67694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.67696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.67697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.67699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.67701: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882465.67703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.67786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.67790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.67793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.67889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 15627 1726882465.69614: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15627 1726882465.69624: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15627 1726882465.69632: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15627 1726882465.69639: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15627 1726882465.69646: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 15627 1726882465.69653: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 15627 1726882465.69662: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15627 1726882465.69674: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882465.69680: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882465.69778: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 15627 1726882465.69786: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 15627 1726882465.69793: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 15627 1726882465.69894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp3wt7u1bl /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/AnsiballZ_stat.py <<< 15627 1726882465.69992: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882465.71010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.71151: stderr chunk (state=3): >>><<< 15627 1726882465.71159: stdout chunk (state=3): >>><<< 15627 1726882465.71180: done transferring module to remote 15627 1726882465.71191: 
_low_level_execute_command(): starting 15627 1726882465.71194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/ /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/AnsiballZ_stat.py && sleep 0' 15627 1726882465.71887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.71892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.71902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.71916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.71962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.71978: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882465.71981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.71999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882465.72010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882465.72018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882465.72023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.72041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.72051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.72058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.72067: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882465.72076: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.72141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.72157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.72179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.72280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.74008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.74056: stderr chunk (state=3): >>><<< 15627 1726882465.74059: stdout chunk (state=3): >>><<< 15627 1726882465.74119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882465.74122: _low_level_execute_command(): starting 15627 1726882465.74125: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/AnsiballZ_stat.py && sleep 0' 15627 1726882465.74827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.74830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.74835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.74847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.74872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.74877: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882465.74884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.74908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.74930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.75003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882465.75016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.75020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.75142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.88166: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15627 1726882465.89206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882465.89411: stderr chunk (state=3): >>><<< 15627 1726882465.89419: stdout chunk (state=3): >>><<< 15627 1726882465.89476: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
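[Editor's note: the stat result above and the assert evaluated later in this log correspond to a stat-then-assert pattern from the test task file `assert_device_absent.yml` referenced in this run. The sketch below is a hedged reconstruction, not the file's verbatim contents: the task names, the `path` and `get_*` module_args, the `interface_stat` register, and the `not interface_stat.stat.exists` conditional are all taken directly from the log; everything else is an assumption.]

```yaml
# Hypothetical reconstruction from the module_args and conditionals visible in
# this log; the real tasks live under
# fedora.linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml.
- name: Get stat for interface LSR-TST-br31
  stat:
    path: /sys/class/net/LSR-TST-br31   # module_args in the log show this path
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat              # register name referenced by the assert task

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists  # the log shows this conditional evaluating True
```

With `stat.exists` false for `/sys/class/net/LSR-TST-br31`, the assert passes ("All assertions passed"), confirming the bridge device is absent before the test play that adds it.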
15627 1726882465.89639: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882465.89648: _low_level_execute_command(): starting 15627 1726882465.89650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882465.62651-15908-246236285984441/ > /dev/null 2>&1 && sleep 0' 15627 1726882465.90265: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882465.90301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882465.90304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882465.90358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882465.90361: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.90367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882465.90369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882465.90413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882465.90426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882465.90534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882465.92436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882465.92440: stdout chunk (state=3): >>><<< 15627 1726882465.92442: stderr chunk (state=3): >>><<< 15627 1726882465.92570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882465.92573: handler run complete 15627 1726882465.92575: attempt loop complete, returning result 15627 1726882465.92577: _execute() done 15627 1726882465.92579: dumping result to json 15627 1726882465.92581: done dumping result, returning 15627 1726882465.92583: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000133] 15627 1726882465.92585: sending task result for task 0e448fcc-3ce9-2847-7723-000000000133 15627 1726882465.92661: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000133 15627 1726882465.92667: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15627 1726882465.92733: no more pending results, returning what we have 15627 1726882465.92741: results queue empty 15627 1726882465.92742: checking for any_errors_fatal 15627 1726882465.92744: done checking for any_errors_fatal 15627 1726882465.92745: checking for max_fail_percentage 15627 1726882465.92747: done checking for max_fail_percentage 15627 1726882465.92748: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.92749: done checking to see if all hosts have failed 15627 1726882465.92750: getting the remaining hosts for this loop 15627 1726882465.92751: done getting the remaining hosts for this loop 15627 1726882465.92758: getting the next task for host managed_node1 15627 1726882465.92770: done getting next task for host managed_node1 15627 1726882465.92773: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15627 1726882465.92776: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882465.92780: getting variables 15627 1726882465.92782: in VariableManager get_vars() 15627 1726882465.92820: Calling all_inventory to load vars for managed_node1 15627 1726882465.92823: Calling groups_inventory to load vars for managed_node1 15627 1726882465.92826: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.92844: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.92851: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.92858: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.93328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.93570: done with get_vars() 15627 1726882465.93581: done getting variables 15627 1726882465.93682: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 15627 1726882465.93813: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:34:25 -0400 (0:00:00.356) 0:00:05.690 ****** 15627 1726882465.93842: entering _queue_task() for managed_node1/assert 15627 1726882465.93844: Creating lock for 
assert 15627 1726882465.94113: worker is 1 (out of 1 available) 15627 1726882465.94133: exiting _queue_task() for managed_node1/assert 15627 1726882465.94146: done queuing things up, now waiting for results queue to drain 15627 1726882465.94148: waiting for pending results... 15627 1726882465.94938: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15627 1726882465.95044: in run() - task 0e448fcc-3ce9-2847-7723-00000000011a 15627 1726882465.95053: variable 'ansible_search_path' from source: unknown 15627 1726882465.95061: variable 'ansible_search_path' from source: unknown 15627 1726882465.95094: calling self._execute() 15627 1726882465.95174: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.95179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.95188: variable 'omit' from source: magic vars 15627 1726882465.95451: variable 'ansible_distribution_major_version' from source: facts 15627 1726882465.95466: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882465.95471: variable 'omit' from source: magic vars 15627 1726882465.95540: variable 'omit' from source: magic vars 15627 1726882465.95607: variable 'interface' from source: set_fact 15627 1726882465.95626: variable 'omit' from source: magic vars 15627 1726882465.95657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882465.95695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882465.95722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882465.95800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.95809: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882465.95832: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882465.95835: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.95837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.95926: Set connection var ansible_timeout to 10 15627 1726882465.95932: Set connection var ansible_shell_executable to /bin/sh 15627 1726882465.95937: Set connection var ansible_connection to ssh 15627 1726882465.95942: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882465.95947: Set connection var ansible_pipelining to False 15627 1726882465.95950: Set connection var ansible_shell_type to sh 15627 1726882465.95971: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.95976: variable 'ansible_connection' from source: unknown 15627 1726882465.95979: variable 'ansible_module_compression' from source: unknown 15627 1726882465.96005: variable 'ansible_shell_type' from source: unknown 15627 1726882465.96010: variable 'ansible_shell_executable' from source: unknown 15627 1726882465.96012: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882465.96014: variable 'ansible_pipelining' from source: unknown 15627 1726882465.96016: variable 'ansible_timeout' from source: unknown 15627 1726882465.96019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882465.96173: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882465.96177: variable 'omit' from source: magic vars 15627 1726882465.96202: starting 
attempt loop 15627 1726882465.96207: running the handler 15627 1726882465.96281: variable 'interface_stat' from source: set_fact 15627 1726882465.96288: Evaluated conditional (not interface_stat.stat.exists): True 15627 1726882465.96292: handler run complete 15627 1726882465.96303: attempt loop complete, returning result 15627 1726882465.96306: _execute() done 15627 1726882465.96308: dumping result to json 15627 1726882465.96311: done dumping result, returning 15627 1726882465.96326: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-00000000011a] 15627 1726882465.96339: sending task result for task 0e448fcc-3ce9-2847-7723-00000000011a ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882465.96497: no more pending results, returning what we have 15627 1726882465.96500: results queue empty 15627 1726882465.96501: checking for any_errors_fatal 15627 1726882465.96506: done checking for any_errors_fatal 15627 1726882465.96507: checking for max_fail_percentage 15627 1726882465.96509: done checking for max_fail_percentage 15627 1726882465.96510: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.96511: done checking to see if all hosts have failed 15627 1726882465.96512: getting the remaining hosts for this loop 15627 1726882465.96513: done getting the remaining hosts for this loop 15627 1726882465.96517: getting the next task for host managed_node1 15627 1726882465.96525: done getting next task for host managed_node1 15627 1726882465.96528: ^ task is: TASK: meta (flush_handlers) 15627 1726882465.96529: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882465.96533: getting variables 15627 1726882465.96534: in VariableManager get_vars() 15627 1726882465.96568: Calling all_inventory to load vars for managed_node1 15627 1726882465.96574: Calling groups_inventory to load vars for managed_node1 15627 1726882465.96579: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.96590: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.96593: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.96596: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.96811: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000011a 15627 1726882465.96817: WORKER PROCESS EXITING 15627 1726882465.96859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.97086: done with get_vars() 15627 1726882465.97098: done getting variables 15627 1726882465.97177: in VariableManager get_vars() 15627 1726882465.97185: Calling all_inventory to load vars for managed_node1 15627 1726882465.97186: Calling groups_inventory to load vars for managed_node1 15627 1726882465.97191: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.97195: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.97197: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.97199: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.97340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.97542: done with get_vars() 15627 1726882465.97556: done queuing things up, now waiting for results queue to drain 15627 1726882465.97558: results queue empty 15627 1726882465.97559: checking for any_errors_fatal 15627 1726882465.97561: done checking for any_errors_fatal 15627 1726882465.97562: checking for max_fail_percentage 15627 
1726882465.97565: done checking for max_fail_percentage 15627 1726882465.97565: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.97566: done checking to see if all hosts have failed 15627 1726882465.97572: getting the remaining hosts for this loop 15627 1726882465.97573: done getting the remaining hosts for this loop 15627 1726882465.97575: getting the next task for host managed_node1 15627 1726882465.97578: done getting next task for host managed_node1 15627 1726882465.97580: ^ task is: TASK: meta (flush_handlers) 15627 1726882465.97581: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882465.97583: getting variables 15627 1726882465.97584: in VariableManager get_vars() 15627 1726882465.97591: Calling all_inventory to load vars for managed_node1 15627 1726882465.97593: Calling groups_inventory to load vars for managed_node1 15627 1726882465.97595: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882465.97599: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.97601: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.97604: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.97747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.97987: done with get_vars() 15627 1726882465.97994: done getting variables 15627 1726882465.98036: in VariableManager get_vars() 15627 1726882465.98044: Calling all_inventory to load vars for managed_node1 15627 1726882465.98046: Calling groups_inventory to load vars for managed_node1 15627 1726882465.98048: Calling all_plugins_inventory to load vars for managed_node1 15627 
1726882465.98055: Calling all_plugins_play to load vars for managed_node1 15627 1726882465.98061: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882465.98065: Calling groups_plugins_play to load vars for managed_node1 15627 1726882465.98216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882465.98420: done with get_vars() 15627 1726882465.98431: done queuing things up, now waiting for results queue to drain 15627 1726882465.98433: results queue empty 15627 1726882465.98434: checking for any_errors_fatal 15627 1726882465.98435: done checking for any_errors_fatal 15627 1726882465.98435: checking for max_fail_percentage 15627 1726882465.98436: done checking for max_fail_percentage 15627 1726882465.98437: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.98438: done checking to see if all hosts have failed 15627 1726882465.98439: getting the remaining hosts for this loop 15627 1726882465.98440: done getting the remaining hosts for this loop 15627 1726882465.98442: getting the next task for host managed_node1 15627 1726882465.98444: done getting next task for host managed_node1 15627 1726882465.98445: ^ task is: None 15627 1726882465.98446: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882465.98447: done queuing things up, now waiting for results queue to drain 15627 1726882465.98448: results queue empty 15627 1726882465.98449: checking for any_errors_fatal 15627 1726882465.98449: done checking for any_errors_fatal 15627 1726882465.98450: checking for max_fail_percentage 15627 1726882465.98451: done checking for max_fail_percentage 15627 1726882465.98452: checking to see if all hosts have failed and the running result is not ok 15627 1726882465.98452: done checking to see if all hosts have failed 15627 1726882465.98454: getting the next task for host managed_node1 15627 1726882465.98457: done getting next task for host managed_node1 15627 1726882465.98457: ^ task is: None 15627 1726882465.98459: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882465.98509: in VariableManager get_vars() 15627 1726882465.98530: done with get_vars() 15627 1726882465.98535: in VariableManager get_vars() 15627 1726882465.98551: done with get_vars() 15627 1726882465.98556: variable 'omit' from source: magic vars 15627 1726882465.98588: in VariableManager get_vars() 15627 1726882465.98605: done with get_vars() 15627 1726882465.98628: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15627 1726882466.00216: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882466.00236: getting the remaining hosts for this loop 15627 1726882466.00237: done getting the remaining hosts for this loop 15627 1726882466.00240: getting the next task for host managed_node1 15627 1726882466.00242: done getting next task for host managed_node1 15627 1726882466.00243: ^ task is: TASK: Gathering Facts 15627 1726882466.00245: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882466.00246: getting variables 15627 1726882466.00247: in VariableManager get_vars() 15627 1726882466.00257: Calling all_inventory to load vars for managed_node1 15627 1726882466.00259: Calling groups_inventory to load vars for managed_node1 15627 1726882466.00261: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.00267: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.00269: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.00272: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.00390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.00581: done with get_vars() 15627 1726882466.00595: done getting variables 15627 1726882466.00631: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 21:34:26 -0400 (0:00:00.068) 0:00:05.758 ****** 15627 1726882466.00651: entering _queue_task() for managed_node1/gather_facts 15627 1726882466.00880: worker is 1 (out of 1 available) 15627 1726882466.00893: exiting _queue_task() for managed_node1/gather_facts 15627 1726882466.00903: done queuing things up, now waiting for results queue to drain 15627 1726882466.00904: waiting for pending results... 
15627 1726882466.01053: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882466.01113: in run() - task 0e448fcc-3ce9-2847-7723-00000000014c 15627 1726882466.01126: variable 'ansible_search_path' from source: unknown 15627 1726882466.01155: calling self._execute() 15627 1726882466.01220: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.01224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.01232: variable 'omit' from source: magic vars 15627 1726882466.01501: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.01510: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.01516: variable 'omit' from source: magic vars 15627 1726882466.01535: variable 'omit' from source: magic vars 15627 1726882466.01560: variable 'omit' from source: magic vars 15627 1726882466.01592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882466.01618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882466.01635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882466.01649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882466.01660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882466.01686: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882466.01689: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.01691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.01755: Set connection var ansible_timeout to 10 15627 1726882466.01766: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882466.01769: Set connection var ansible_connection to ssh 15627 1726882466.01775: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882466.01780: Set connection var ansible_pipelining to False 15627 1726882466.01784: Set connection var ansible_shell_type to sh 15627 1726882466.01801: variable 'ansible_shell_executable' from source: unknown 15627 1726882466.01804: variable 'ansible_connection' from source: unknown 15627 1726882466.01807: variable 'ansible_module_compression' from source: unknown 15627 1726882466.01809: variable 'ansible_shell_type' from source: unknown 15627 1726882466.01811: variable 'ansible_shell_executable' from source: unknown 15627 1726882466.01814: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.01816: variable 'ansible_pipelining' from source: unknown 15627 1726882466.01819: variable 'ansible_timeout' from source: unknown 15627 1726882466.01823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.01981: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882466.01996: variable 'omit' from source: magic vars 15627 1726882466.02011: starting attempt loop 15627 1726882466.02017: running the handler 15627 1726882466.02035: variable 'ansible_facts' from source: unknown 15627 1726882466.02095: _low_level_execute_command(): starting 15627 1726882466.02107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882466.02996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882466.03000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.03003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.03049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.03053: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15627 1726882466.03057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.03059: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.03111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.03137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.03155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.03265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.04911: stdout chunk (state=3): >>>/root <<< 15627 1726882466.05012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.05062: stderr chunk (state=3): >>><<< 15627 1726882466.05067: stdout chunk (state=3): >>><<< 15627 1726882466.05087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882466.05099: _low_level_execute_command(): starting 15627 1726882466.05105: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054 `" && echo ansible-tmp-1726882466.0508757-15933-231975574500054="` echo /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054 `" ) && sleep 0' 15627 1726882466.05574: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.05588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.05601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 
1726882466.05621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.05661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.05679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.05790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.07667: stdout chunk (state=3): >>>ansible-tmp-1726882466.0508757-15933-231975574500054=/root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054 <<< 15627 1726882466.07790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.07847: stderr chunk (state=3): >>><<< 15627 1726882466.07856: stdout chunk (state=3): >>><<< 15627 1726882466.07880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882466.0508757-15933-231975574500054=/root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882466.07910: variable 'ansible_module_compression' from source: unknown 15627 1726882466.07971: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882466.08026: variable 'ansible_facts' from source: unknown 15627 1726882466.08202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/AnsiballZ_setup.py 15627 1726882466.08318: Sending initial data 15627 1726882466.08327: Sent initial data (154 bytes) 15627 1726882466.08978: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.08981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.09011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.09015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.09019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.09068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.09081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.09185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.10884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882466.10975: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882466.11073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmplpuinl9m /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/AnsiballZ_setup.py <<< 15627 1726882466.11168: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882466.13110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.13202: stderr chunk (state=3): 
>>><<< 15627 1726882466.13206: stdout chunk (state=3): >>><<< 15627 1726882466.13225: done transferring module to remote 15627 1726882466.13233: _low_level_execute_command(): starting 15627 1726882466.13236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/ /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/AnsiballZ_setup.py && sleep 0' 15627 1726882466.13672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.13676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.13708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.13711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.13713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.13756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.13773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.13871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.15587: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.15630: stderr chunk (state=3): >>><<< 15627 1726882466.15633: stdout chunk (state=3): >>><<< 15627 1726882466.15647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882466.15650: _low_level_execute_command(): starting 15627 1726882466.15657: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/AnsiballZ_setup.py && sleep 0' 15627 1726882466.16064: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.16080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 
1726882466.16099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882466.16110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882466.16120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.16169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.16188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.16287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.66961: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_env": {"SHELL": 
"/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_loadavg": {"1m": 0.54, "5m": 0.38, "15m": 0.19}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "26", "epoch": "1726882466", "epoch_int": "1726882466", "date": "2024-09-20", "time": "21:34:26", "iso8601_micro": "2024-09-21T01:34:26.412504Z", "iso8601": "2024-09-21T01:34:26Z", "iso8601_basic": "20240920T213426412504", "iso8601_basic_short": "20240920T213426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": 
false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, 
"ansible_memfree_mb": 2789, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 743, "free": 2789}, "nocache": {"free": 3250, "used": 282}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241348608, "block_size": 
4096, "block_total": 65519355, "block_available": 64512048, "block_used": 1007307, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882466.68672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882466.68766: stderr chunk (state=3): >>><<< 15627 1726882466.68770: stdout chunk (state=3): >>><<< 15627 1726882466.68869: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.54, "5m": 0.38, "15m": 0.19}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "26", "epoch": "1726882466", "epoch_int": "1726882466", 
"date": "2024-09-20", "time": "21:34:26", "iso8601_micro": "2024-09-21T01:34:26.412504Z", "iso8601": "2024-09-21T01:34:26Z", "iso8601_basic": "20240920T213426412504", "iso8601_basic_short": "20240920T213426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": 
"off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", 
"macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2789, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 743, "free": 2789}, "nocache": {"free": 3250, "used": 282}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241348608, "block_size": 4096, "block_total": 65519355, "block_available": 64512048, "block_used": 1007307, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882466.69187: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882466.69213: _low_level_execute_command(): starting 15627 1726882466.69223: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882466.0508757-15933-231975574500054/ > /dev/null 2>&1 && sleep 0' 15627 1726882466.69952: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882466.69973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.69990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.70009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.70057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.70074: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882466.70090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.70109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882466.70122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882466.70150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882466.70171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.70186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.70202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.70221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.70233: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882466.70253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.70352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.70389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.70413: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15627 1726882466.70565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882466.72461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.72488: stderr chunk (state=3): >>><<< 15627 1726882466.72492: stdout chunk (state=3): >>><<< 15627 1726882466.72515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882466.72522: handler run complete 15627 1726882466.72598: variable 'ansible_facts' from source: unknown 15627 1726882466.72676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.72961: variable 'ansible_facts' from source: unknown 15627 1726882466.73018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 15627 1726882466.73143: attempt loop complete, returning result 15627 1726882466.73158: _execute() done 15627 1726882466.73171: dumping result to json 15627 1726882466.73199: done dumping result, returning 15627 1726882466.73207: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-00000000014c] 15627 1726882466.73213: sending task result for task 0e448fcc-3ce9-2847-7723-00000000014c 15627 1726882466.73571: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000014c 15627 1726882466.73574: WORKER PROCESS EXITING ok: [managed_node1] 15627 1726882466.73803: no more pending results, returning what we have 15627 1726882466.73806: results queue empty 15627 1726882466.73807: checking for any_errors_fatal 15627 1726882466.73808: done checking for any_errors_fatal 15627 1726882466.73809: checking for max_fail_percentage 15627 1726882466.73810: done checking for max_fail_percentage 15627 1726882466.73811: checking to see if all hosts have failed and the running result is not ok 15627 1726882466.73812: done checking to see if all hosts have failed 15627 1726882466.73812: getting the remaining hosts for this loop 15627 1726882466.73814: done getting the remaining hosts for this loop 15627 1726882466.73816: getting the next task for host managed_node1 15627 1726882466.73821: done getting next task for host managed_node1 15627 1726882466.73823: ^ task is: TASK: meta (flush_handlers) 15627 1726882466.73825: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882466.73838: getting variables 15627 1726882466.73839: in VariableManager get_vars() 15627 1726882466.73900: Calling all_inventory to load vars for managed_node1 15627 1726882466.73904: Calling groups_inventory to load vars for managed_node1 15627 1726882466.73906: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.73916: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.73919: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.73922: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.74097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.74305: done with get_vars() 15627 1726882466.74316: done getting variables 15627 1726882466.74387: in VariableManager get_vars() 15627 1726882466.74399: Calling all_inventory to load vars for managed_node1 15627 1726882466.74401: Calling groups_inventory to load vars for managed_node1 15627 1726882466.74403: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.74407: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.74409: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.74415: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.74578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.74798: done with get_vars() 15627 1726882466.74823: done queuing things up, now waiting for results queue to drain 15627 1726882466.74838: results queue empty 15627 1726882466.74842: checking for any_errors_fatal 15627 1726882466.74845: done checking for any_errors_fatal 15627 1726882466.74846: checking for max_fail_percentage 15627 1726882466.74847: done checking for max_fail_percentage 15627 1726882466.74848: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882466.74848: done checking to see if all hosts have failed 15627 1726882466.74849: getting the remaining hosts for this loop 15627 1726882466.74850: done getting the remaining hosts for this loop 15627 1726882466.74853: getting the next task for host managed_node1 15627 1726882466.74859: done getting next task for host managed_node1 15627 1726882466.74870: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882466.74873: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882466.74882: getting variables 15627 1726882466.74884: in VariableManager get_vars() 15627 1726882466.74896: Calling all_inventory to load vars for managed_node1 15627 1726882466.74898: Calling groups_inventory to load vars for managed_node1 15627 1726882466.74900: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.74911: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.74919: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.74928: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.75126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.75293: done with get_vars() 15627 1726882466.75305: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:26 -0400 (0:00:00.747) 0:00:06.505 ****** 15627 1726882466.75361: entering _queue_task() for managed_node1/include_tasks 15627 1726882466.75582: worker is 1 (out of 1 available) 15627 
1726882466.75595: exiting _queue_task() for managed_node1/include_tasks 15627 1726882466.75605: done queuing things up, now waiting for results queue to drain 15627 1726882466.75607: waiting for pending results... 15627 1726882466.75756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882466.75841: in run() - task 0e448fcc-3ce9-2847-7723-000000000014 15627 1726882466.75852: variable 'ansible_search_path' from source: unknown 15627 1726882466.75855: variable 'ansible_search_path' from source: unknown 15627 1726882466.75900: calling self._execute() 15627 1726882466.75986: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.75990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.76006: variable 'omit' from source: magic vars 15627 1726882466.76405: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.76414: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.76419: _execute() done 15627 1726882466.76422: dumping result to json 15627 1726882466.76424: done dumping result, returning 15627 1726882466.76434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-2847-7723-000000000014] 15627 1726882466.76436: sending task result for task 0e448fcc-3ce9-2847-7723-000000000014 15627 1726882466.76518: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000014 15627 1726882466.76520: WORKER PROCESS EXITING 15627 1726882466.76621: no more pending results, returning what we have 15627 1726882466.76625: in VariableManager get_vars() 15627 1726882466.76679: Calling all_inventory to load vars for managed_node1 15627 1726882466.76682: Calling groups_inventory to load vars for managed_node1 15627 1726882466.76685: Calling all_plugins_inventory to load vars for managed_node1 
15627 1726882466.76691: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.76693: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.76695: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.76902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.77043: done with get_vars() 15627 1726882466.77048: variable 'ansible_search_path' from source: unknown 15627 1726882466.77049: variable 'ansible_search_path' from source: unknown 15627 1726882466.77081: we have included files to process 15627 1726882466.77083: generating all_blocks data 15627 1726882466.77084: done generating all_blocks data 15627 1726882466.77085: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882466.77085: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882466.77087: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882466.77899: done processing included file 15627 1726882466.77904: iterating over new_blocks loaded from include file 15627 1726882466.77905: in VariableManager get_vars() 15627 1726882466.77932: done with get_vars() 15627 1726882466.77937: filtering new block on tags 15627 1726882466.77959: done filtering new block on tags 15627 1726882466.77962: in VariableManager get_vars() 15627 1726882466.77983: done with get_vars() 15627 1726882466.77985: filtering new block on tags 15627 1726882466.78004: done filtering new block on tags 15627 1726882466.78006: in VariableManager get_vars() 15627 1726882466.78024: done with get_vars() 15627 1726882466.78026: filtering new block on tags 15627 1726882466.78041: done filtering new block on tags 15627 1726882466.78043: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15627 1726882466.78048: extending task lists for all hosts with included blocks 15627 1726882466.78481: done extending task lists 15627 1726882466.78482: done processing included files 15627 1726882466.78483: results queue empty 15627 1726882466.78483: checking for any_errors_fatal 15627 1726882466.78485: done checking for any_errors_fatal 15627 1726882466.78485: checking for max_fail_percentage 15627 1726882466.78486: done checking for max_fail_percentage 15627 1726882466.78487: checking to see if all hosts have failed and the running result is not ok 15627 1726882466.78488: done checking to see if all hosts have failed 15627 1726882466.78489: getting the remaining hosts for this loop 15627 1726882466.78490: done getting the remaining hosts for this loop 15627 1726882466.78492: getting the next task for host managed_node1 15627 1726882466.78496: done getting next task for host managed_node1 15627 1726882466.78498: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882466.78500: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882466.78509: getting variables 15627 1726882466.78509: in VariableManager get_vars() 15627 1726882466.78521: Calling all_inventory to load vars for managed_node1 15627 1726882466.78523: Calling groups_inventory to load vars for managed_node1 15627 1726882466.78525: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.78529: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.78531: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.78534: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.78727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.78938: done with get_vars() 15627 1726882466.78956: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:26 -0400 (0:00:00.037) 0:00:06.542 ****** 15627 1726882466.79070: entering _queue_task() for managed_node1/setup 15627 1726882466.79399: worker is 1 (out of 1 available) 15627 1726882466.79412: exiting _queue_task() for managed_node1/setup 15627 1726882466.79431: done queuing things up, now waiting for results queue to drain 15627 1726882466.79432: waiting for pending results... 
15627 1726882466.79709: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882466.79830: in run() - task 0e448fcc-3ce9-2847-7723-00000000018d 15627 1726882466.79848: variable 'ansible_search_path' from source: unknown 15627 1726882466.79857: variable 'ansible_search_path' from source: unknown 15627 1726882466.79900: calling self._execute() 15627 1726882466.79984: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.79999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.80013: variable 'omit' from source: magic vars 15627 1726882466.80370: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.80386: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.80633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882466.82769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882466.82819: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882466.82848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882466.82875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882466.82896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882466.82952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882466.82978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882466.83025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882466.83050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882466.83067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882466.83128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882466.83144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882466.83163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882466.83220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882466.83231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882466.83360: variable '__network_required_facts' from source: role 
'' defaults 15627 1726882466.83366: variable 'ansible_facts' from source: unknown 15627 1726882466.83456: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15627 1726882466.83467: when evaluation is False, skipping this task 15627 1726882466.83474: _execute() done 15627 1726882466.83480: dumping result to json 15627 1726882466.83486: done dumping result, returning 15627 1726882466.83497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-2847-7723-00000000018d] 15627 1726882466.83505: sending task result for task 0e448fcc-3ce9-2847-7723-00000000018d 15627 1726882466.83611: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000018d 15627 1726882466.83618: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882466.83921: no more pending results, returning what we have 15627 1726882466.83925: results queue empty 15627 1726882466.83926: checking for any_errors_fatal 15627 1726882466.83927: done checking for any_errors_fatal 15627 1726882466.83928: checking for max_fail_percentage 15627 1726882466.83929: done checking for max_fail_percentage 15627 1726882466.83930: checking to see if all hosts have failed and the running result is not ok 15627 1726882466.83931: done checking to see if all hosts have failed 15627 1726882466.83932: getting the remaining hosts for this loop 15627 1726882466.83933: done getting the remaining hosts for this loop 15627 1726882466.83936: getting the next task for host managed_node1 15627 1726882466.83944: done getting next task for host managed_node1 15627 1726882466.83947: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882466.83950: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882466.83963: getting variables 15627 1726882466.83966: in VariableManager get_vars() 15627 1726882466.84005: Calling all_inventory to load vars for managed_node1 15627 1726882466.84008: Calling groups_inventory to load vars for managed_node1 15627 1726882466.84011: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.84020: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.84023: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.84025: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.84249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.84487: done with get_vars() 15627 1726882466.84496: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:26 -0400 (0:00:00.058) 0:00:06.600 ****** 15627 1726882466.84874: entering _queue_task() for managed_node1/stat 15627 1726882466.85108: worker is 1 (out of 1 available) 15627 1726882466.85119: exiting _queue_task() for managed_node1/stat 15627 1726882466.85129: done queuing things up, now waiting for results queue to drain 15627 1726882466.85130: waiting for pending results... 
15627 1726882466.85921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882466.86201: in run() - task 0e448fcc-3ce9-2847-7723-00000000018f 15627 1726882466.86349: variable 'ansible_search_path' from source: unknown 15627 1726882466.86357: variable 'ansible_search_path' from source: unknown 15627 1726882466.86399: calling self._execute() 15627 1726882466.86551: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.86608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.86625: variable 'omit' from source: magic vars 15627 1726882466.87012: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.87030: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.87197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882466.87472: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882466.87520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882466.87565: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882466.87604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882466.87691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882466.87746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882466.87785: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882466.87816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882466.87909: variable '__network_is_ostree' from source: set_fact 15627 1726882466.87921: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882466.87928: when evaluation is False, skipping this task 15627 1726882466.87935: _execute() done 15627 1726882466.87942: dumping result to json 15627 1726882466.87949: done dumping result, returning 15627 1726882466.87960: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-2847-7723-00000000018f] 15627 1726882466.87972: sending task result for task 0e448fcc-3ce9-2847-7723-00000000018f 15627 1726882466.88080: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000018f skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882466.88134: no more pending results, returning what we have 15627 1726882466.88138: results queue empty 15627 1726882466.88139: checking for any_errors_fatal 15627 1726882466.88143: done checking for any_errors_fatal 15627 1726882466.88144: checking for max_fail_percentage 15627 1726882466.88145: done checking for max_fail_percentage 15627 1726882466.88146: checking to see if all hosts have failed and the running result is not ok 15627 1726882466.88148: done checking to see if all hosts have failed 15627 1726882466.88149: getting the remaining hosts for this loop 15627 1726882466.88150: done getting the remaining hosts for this loop 15627 1726882466.88154: getting the next task for host 
managed_node1 15627 1726882466.88162: done getting next task for host managed_node1 15627 1726882466.88168: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882466.88171: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882466.88183: getting variables 15627 1726882466.88185: in VariableManager get_vars() 15627 1726882466.88222: Calling all_inventory to load vars for managed_node1 15627 1726882466.88226: Calling groups_inventory to load vars for managed_node1 15627 1726882466.88228: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.88238: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.88241: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.88245: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.88635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.88846: done with get_vars() 15627 1726882466.88855: done getting variables 15627 1726882466.88919: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to 
indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:26 -0400 (0:00:00.040) 0:00:06.641 ****** 15627 1726882466.88958: entering _queue_task() for managed_node1/set_fact 15627 1726882466.89246: WORKER PROCESS EXITING 15627 1726882466.89458: worker is 1 (out of 1 available) 15627 1726882466.89472: exiting _queue_task() for managed_node1/set_fact 15627 1726882466.89483: done queuing things up, now waiting for results queue to drain 15627 1726882466.89485: waiting for pending results... 15627 1726882466.89716: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882466.89819: in run() - task 0e448fcc-3ce9-2847-7723-000000000190 15627 1726882466.90168: variable 'ansible_search_path' from source: unknown 15627 1726882466.90177: variable 'ansible_search_path' from source: unknown 15627 1726882466.90212: calling self._execute() 15627 1726882466.90305: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.90316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.90330: variable 'omit' from source: magic vars 15627 1726882466.90918: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.90945: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.91122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882466.91416: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882466.91466: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882466.91691: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 
1726882466.91735: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882466.91820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882466.91850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882466.91883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882466.91913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882466.92006: variable '__network_is_ostree' from source: set_fact 15627 1726882466.92018: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882466.92026: when evaluation is False, skipping this task 15627 1726882466.92033: _execute() done 15627 1726882466.92041: dumping result to json 15627 1726882466.92050: done dumping result, returning 15627 1726882466.92060: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-2847-7723-000000000190] 15627 1726882466.92072: sending task result for task 0e448fcc-3ce9-2847-7723-000000000190 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882466.92206: no more pending results, returning what we have 15627 1726882466.92210: results queue empty 15627 1726882466.92211: checking for any_errors_fatal 15627 1726882466.92217: done checking 
for any_errors_fatal 15627 1726882466.92218: checking for max_fail_percentage 15627 1726882466.92220: done checking for max_fail_percentage 15627 1726882466.92221: checking to see if all hosts have failed and the running result is not ok 15627 1726882466.92222: done checking to see if all hosts have failed 15627 1726882466.92223: getting the remaining hosts for this loop 15627 1726882466.92225: done getting the remaining hosts for this loop 15627 1726882466.92228: getting the next task for host managed_node1 15627 1726882466.92239: done getting next task for host managed_node1 15627 1726882466.92243: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882466.92246: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882466.92257: getting variables 15627 1726882466.92259: in VariableManager get_vars() 15627 1726882466.92299: Calling all_inventory to load vars for managed_node1 15627 1726882466.92302: Calling groups_inventory to load vars for managed_node1 15627 1726882466.92304: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882466.92314: Calling all_plugins_play to load vars for managed_node1 15627 1726882466.92317: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882466.92321: Calling groups_plugins_play to load vars for managed_node1 15627 1726882466.92533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882466.92740: done with get_vars() 15627 1726882466.92751: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:26 -0400 (0:00:00.038) 0:00:06.680 ****** 15627 1726882466.92849: entering _queue_task() for managed_node1/service_facts 15627 1726882466.92852: Creating lock for service_facts 15627 1726882466.92895: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000190 15627 1726882466.92904: WORKER PROCESS EXITING 15627 1726882466.93317: worker is 1 (out of 1 available) 15627 1726882466.93330: exiting _queue_task() for managed_node1/service_facts 15627 1726882466.93340: done queuing things up, now waiting for results queue to drain 15627 1726882466.93341: waiting for pending results... 
15627 1726882466.93582: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882466.93693: in run() - task 0e448fcc-3ce9-2847-7723-000000000192 15627 1726882466.93711: variable 'ansible_search_path' from source: unknown 15627 1726882466.93718: variable 'ansible_search_path' from source: unknown 15627 1726882466.93755: calling self._execute() 15627 1726882466.93838: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.93850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.93865: variable 'omit' from source: magic vars 15627 1726882466.94209: variable 'ansible_distribution_major_version' from source: facts 15627 1726882466.94226: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882466.94235: variable 'omit' from source: magic vars 15627 1726882466.94287: variable 'omit' from source: magic vars 15627 1726882466.94326: variable 'omit' from source: magic vars 15627 1726882466.94373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882466.94413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882466.94444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882466.94470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882466.94487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882466.94522: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882466.94531: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.94542: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882466.94645: Set connection var ansible_timeout to 10 15627 1726882466.94665: Set connection var ansible_shell_executable to /bin/sh 15627 1726882466.94676: Set connection var ansible_connection to ssh 15627 1726882466.94685: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882466.94694: Set connection var ansible_pipelining to False 15627 1726882466.94701: Set connection var ansible_shell_type to sh 15627 1726882466.94727: variable 'ansible_shell_executable' from source: unknown 15627 1726882466.94736: variable 'ansible_connection' from source: unknown 15627 1726882466.94744: variable 'ansible_module_compression' from source: unknown 15627 1726882466.94751: variable 'ansible_shell_type' from source: unknown 15627 1726882466.94762: variable 'ansible_shell_executable' from source: unknown 15627 1726882466.94772: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882466.94780: variable 'ansible_pipelining' from source: unknown 15627 1726882466.94787: variable 'ansible_timeout' from source: unknown 15627 1726882466.94794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882466.94997: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882466.95013: variable 'omit' from source: magic vars 15627 1726882466.95022: starting attempt loop 15627 1726882466.95029: running the handler 15627 1726882466.95046: _low_level_execute_command(): starting 15627 1726882466.95058: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882466.95834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882466.95853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15627 1726882466.95871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.95890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.95934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.95946: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882466.95963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.95985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882466.95999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882466.96011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882466.96024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.96037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.96052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.96067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.96080: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882466.96093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.96172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.96200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.96218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882466.96348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882466.98014: stdout chunk (state=3): >>>/root <<< 15627 1726882466.98202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882466.98206: stdout chunk (state=3): >>><<< 15627 1726882466.98209: stderr chunk (state=3): >>><<< 15627 1726882466.98271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882466.98274: _low_level_execute_command(): starting 15627 1726882466.98277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133 `" && echo ansible-tmp-1726882466.9822695-15988-10878904948133="` echo /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133 `" ) && sleep 0' 15627 1726882466.98920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882466.98933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.98952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.98976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.99019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.99030: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882466.99046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.99067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882466.99080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882466.99090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882466.99101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882466.99112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882466.99127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882466.99138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882466.99148: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882466.99166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882466.99242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882466.99267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882466.99286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882466.99410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882467.01280: stdout chunk (state=3): >>>ansible-tmp-1726882466.9822695-15988-10878904948133=/root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133 <<< 15627 1726882467.01468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882467.01472: stdout chunk (state=3): >>><<< 15627 1726882467.01474: stderr chunk (state=3): >>><<< 15627 1726882467.01871: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882466.9822695-15988-10878904948133=/root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882467.01874: variable 'ansible_module_compression' from source: unknown 15627 1726882467.01877: ANSIBALLZ: Using lock for service_facts 15627 
1726882467.01879: ANSIBALLZ: Acquiring lock 15627 1726882467.01881: ANSIBALLZ: Lock acquired: 140251851983376 15627 1726882467.01883: ANSIBALLZ: Creating module 15627 1726882467.14194: ANSIBALLZ: Writing module into payload 15627 1726882467.14315: ANSIBALLZ: Writing module 15627 1726882467.14346: ANSIBALLZ: Renaming module 15627 1726882467.14365: ANSIBALLZ: Done creating module 15627 1726882467.14387: variable 'ansible_facts' from source: unknown 15627 1726882467.14465: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/AnsiballZ_service_facts.py 15627 1726882467.14611: Sending initial data 15627 1726882467.14614: Sent initial data (161 bytes) 15627 1726882467.15541: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882467.15556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.15573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882467.15591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882467.15631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882467.15642: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882467.15655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882467.15678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882467.15691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882467.15702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882467.15713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.15725: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15627 1726882467.15740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882467.15751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882467.15761: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882467.15782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882467.15857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882467.15875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882467.15890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882467.16587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882467.18398: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882467.18490: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882467.18590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpq83w9jvl /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/AnsiballZ_service_facts.py <<< 15627 1726882467.18686: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 15627 1726882467.20724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882467.20770: stderr chunk (state=3): >>><<< 15627 1726882467.20773: stdout chunk (state=3): >>><<< 15627 1726882467.20775: done transferring module to remote 15627 1726882467.20854: _low_level_execute_command(): starting 15627 1726882467.20858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/ /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/AnsiballZ_service_facts.py && sleep 0' 15627 1726882467.21807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882467.22781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.22797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882467.22815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882467.22862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882467.22878: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882467.22893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882467.22911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882467.22923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882467.22935: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882467.22946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.22960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882467.22979: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882467.22992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882467.23004: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882467.23017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882467.23095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882467.23112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882467.23126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882467.23250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882467.25090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882467.25096: stdout chunk (state=3): >>><<< 15627 1726882467.25098: stderr chunk (state=3): >>><<< 15627 1726882467.25193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882467.25196: _low_level_execute_command(): starting 15627 1726882467.25200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/AnsiballZ_service_facts.py && sleep 0' 15627 1726882467.26938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.26942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882467.26975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882467.26978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15627 1726882467.26980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882467.27100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882467.27104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882467.27162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882467.27177: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882467.27314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882468.60484: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 15627 1726882468.60500: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 15627 1726882468.60535: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 15627 1726882468.60549: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", 
"state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 15627 1726882468.60558: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": 
{"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15627 1726882468.61775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882468.61829: stderr chunk (state=3): >>><<< 15627 1726882468.61832: stdout chunk (state=3): >>><<< 15627 1726882468.61852: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": 
{"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": 
"oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882468.63151: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882468.63168: _low_level_execute_command(): starting 15627 1726882468.63177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882466.9822695-15988-10878904948133/ > /dev/null 2>&1 && sleep 0' 15627 1726882468.63757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882468.63772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882468.63785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882468.63799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882468.63837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882468.63846: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882468.63858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.63879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882468.63889: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 15627 1726882468.63898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882468.63907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882468.63917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882468.63928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882468.63937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882468.63945: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882468.63955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.64033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882468.64036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882468.64042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882468.64134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882468.65951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882468.66000: stderr chunk (state=3): >>><<< 15627 1726882468.66003: stdout chunk (state=3): >>><<< 15627 1726882468.66017: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882468.66022: handler run complete 15627 1726882468.66169: variable 'ansible_facts' from source: unknown 15627 1726882468.66196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882468.66438: variable 'ansible_facts' from source: unknown 15627 1726882468.66507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882468.66611: attempt loop complete, returning result 15627 1726882468.66615: _execute() done 15627 1726882468.66617: dumping result to json 15627 1726882468.66649: done dumping result, returning 15627 1726882468.66663: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-2847-7723-000000000192] 15627 1726882468.66668: sending task result for task 0e448fcc-3ce9-2847-7723-000000000192 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882468.67175: no more pending results, returning what we have 15627 1726882468.67178: results queue empty 15627 1726882468.67179: checking for any_errors_fatal 15627 1726882468.67182: done checking for any_errors_fatal 15627 1726882468.67183: checking for max_fail_percentage 15627 1726882468.67184: 
done checking for max_fail_percentage 15627 1726882468.67185: checking to see if all hosts have failed and the running result is not ok 15627 1726882468.67186: done checking to see if all hosts have failed 15627 1726882468.67187: getting the remaining hosts for this loop 15627 1726882468.67188: done getting the remaining hosts for this loop 15627 1726882468.67191: getting the next task for host managed_node1 15627 1726882468.67196: done getting next task for host managed_node1 15627 1726882468.67200: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882468.67202: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882468.67210: getting variables 15627 1726882468.67211: in VariableManager get_vars() 15627 1726882468.67240: Calling all_inventory to load vars for managed_node1 15627 1726882468.67243: Calling groups_inventory to load vars for managed_node1 15627 1726882468.67245: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882468.67253: Calling all_plugins_play to load vars for managed_node1 15627 1726882468.67257: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882468.67260: Calling groups_plugins_play to load vars for managed_node1 15627 1726882468.67270: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000192 15627 1726882468.67273: WORKER PROCESS EXITING 15627 1726882468.67596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882468.67844: done with get_vars() 15627 1726882468.67858: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:28 -0400 (0:00:01.750) 0:00:08.430 ****** 15627 1726882468.67924: entering _queue_task() for managed_node1/package_facts 15627 1726882468.67925: Creating lock for package_facts 15627 1726882468.68118: worker is 1 (out of 1 available) 15627 1726882468.68130: exiting _queue_task() for managed_node1/package_facts 15627 1726882468.68140: done queuing things up, now waiting for results queue to drain 15627 1726882468.68142: waiting for pending results... 
15627 1726882468.68306: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882468.68384: in run() - task 0e448fcc-3ce9-2847-7723-000000000193 15627 1726882468.68396: variable 'ansible_search_path' from source: unknown 15627 1726882468.68399: variable 'ansible_search_path' from source: unknown 15627 1726882468.68426: calling self._execute() 15627 1726882468.68486: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882468.68491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882468.68499: variable 'omit' from source: magic vars 15627 1726882468.68766: variable 'ansible_distribution_major_version' from source: facts 15627 1726882468.68775: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882468.68781: variable 'omit' from source: magic vars 15627 1726882468.68817: variable 'omit' from source: magic vars 15627 1726882468.68840: variable 'omit' from source: magic vars 15627 1726882468.68882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882468.68908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882468.68924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882468.68937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882468.68947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882468.68972: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882468.68976: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882468.68979: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882468.69043: Set connection var ansible_timeout to 10 15627 1726882468.69051: Set connection var ansible_shell_executable to /bin/sh 15627 1726882468.69058: Set connection var ansible_connection to ssh 15627 1726882468.69060: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882468.69067: Set connection var ansible_pipelining to False 15627 1726882468.69069: Set connection var ansible_shell_type to sh 15627 1726882468.69086: variable 'ansible_shell_executable' from source: unknown 15627 1726882468.69089: variable 'ansible_connection' from source: unknown 15627 1726882468.69092: variable 'ansible_module_compression' from source: unknown 15627 1726882468.69095: variable 'ansible_shell_type' from source: unknown 15627 1726882468.69097: variable 'ansible_shell_executable' from source: unknown 15627 1726882468.69099: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882468.69101: variable 'ansible_pipelining' from source: unknown 15627 1726882468.69103: variable 'ansible_timeout' from source: unknown 15627 1726882468.69108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882468.69244: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882468.69252: variable 'omit' from source: magic vars 15627 1726882468.69259: starting attempt loop 15627 1726882468.69261: running the handler 15627 1726882468.69271: _low_level_execute_command(): starting 15627 1726882468.69278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882468.69773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15627 1726882468.69794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882468.69812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882468.69823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.69867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882468.69880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882468.69993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882468.71617: stdout chunk (state=3): >>>/root <<< 15627 1726882468.71717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882468.71767: stderr chunk (state=3): >>><<< 15627 1726882468.71770: stdout chunk (state=3): >>><<< 15627 1726882468.71788: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882468.71800: _low_level_execute_command(): starting 15627 1726882468.71805: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981 `" && echo ansible-tmp-1726882468.7178855-16072-42142270815981="` echo /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981 `" ) && sleep 0' 15627 1726882468.72230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882468.72242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882468.72267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.72278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.72324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882468.72342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882468.72436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882468.74302: stdout chunk (state=3): >>>ansible-tmp-1726882468.7178855-16072-42142270815981=/root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981 <<< 15627 1726882468.74407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882468.74451: stderr chunk (state=3): >>><<< 15627 1726882468.74456: stdout chunk (state=3): >>><<< 15627 1726882468.74469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882468.7178855-16072-42142270815981=/root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882468.74505: variable 'ansible_module_compression' from source: unknown 15627 1726882468.74539: ANSIBALLZ: Using lock for package_facts 15627 1726882468.74542: ANSIBALLZ: Acquiring lock 15627 1726882468.74544: ANSIBALLZ: Lock acquired: 140251854461008 15627 1726882468.74547: ANSIBALLZ: Creating module 15627 1726882468.94012: ANSIBALLZ: Writing module into payload 15627 1726882468.94127: ANSIBALLZ: Writing module 15627 1726882468.94155: ANSIBALLZ: Renaming module 15627 1726882468.94163: ANSIBALLZ: Done creating module 15627 1726882468.94193: variable 'ansible_facts' from source: unknown 15627 1726882468.94371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/AnsiballZ_package_facts.py 15627 1726882468.94527: Sending initial data 15627 1726882468.94530: Sent initial data (161 bytes) 15627 1726882468.95375: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882468.95378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882468.95417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882468.95421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882468.95423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882468.95425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882468.95472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882468.95476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882468.95586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882468.97420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882468.97510: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882468.97601: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp_7fhua7q /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/AnsiballZ_package_facts.py <<< 15627 1726882468.97691: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: 
No such file or directory <<< 15627 1726882468.99639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882468.99739: stderr chunk (state=3): >>><<< 15627 1726882468.99743: stdout chunk (state=3): >>><<< 15627 1726882468.99760: done transferring module to remote 15627 1726882468.99781: _low_level_execute_command(): starting 15627 1726882468.99784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/ /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/AnsiballZ_package_facts.py && sleep 0' 15627 1726882469.00224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882469.00227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882469.00268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.00271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882469.00273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882469.00277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.00322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 
1726882469.00326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882469.00422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882469.02253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882469.02258: stdout chunk (state=3): >>><<< 15627 1726882469.02260: stderr chunk (state=3): >>><<< 15627 1726882469.02340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882469.02346: _low_level_execute_command(): starting 15627 1726882469.02348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/AnsiballZ_package_facts.py && sleep 0' 15627 1726882469.02761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882469.02765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882469.02801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.02804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882469.02806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.02852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882469.02858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882469.02961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882469.49452: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": 
"subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": 
"dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": 
"10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": 
"crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 15627 1726882469.49503: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", 
"version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": 
[{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": 
"2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", 
"version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", 
"release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": 
[{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": 
"libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": 
"0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": 
"zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": 
"460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", 
"version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", 
"release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 15627 1726882469.49524: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", 
"version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": 
[{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15627 1726882469.51153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882469.51158: stdout chunk (state=3): >>><<< 15627 1726882469.51160: stderr chunk (state=3): >>><<< 15627 1726882469.51362: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
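The `package_facts` result above maps each package name to a list of install records (`version`, `release`, `epoch`, `arch`, `source`). A minimal sketch of querying a structure of that shape offline — the two sample entries are copied from the log, but `installed_version` is a hypothetical helper for illustration, not an ansible-core API:

```python
# Sketch: querying a package_facts-style mapping.
# Sample entries taken from the log above; installed_version is a
# hypothetical helper, not part of ansible-core.
packages = {
    "git": [{"name": "git", "version": "2.43.5", "release": "1.el9",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "tar": [{"name": "tar", "version": "1.34", "release": "7.el9",
             "epoch": 2, "arch": "x86_64", "source": "rpm"}],
}

def installed_version(facts, name):
    """Return the version of the first install record for `name`, or None."""
    records = facts.get(name)
    return records[0]["version"] if records else None

print(installed_version(packages, "git"))   # 2.43.5
print(installed_version(packages, "wget"))  # None
```

Note that the value for each name is a list because multiple versions or architectures of one package can be installed side by side; taking `records[0]` is a simplification.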
15627 1726882469.55712: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882469.55743: _low_level_execute_command(): starting 15627 1726882469.55758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882468.7178855-16072-42142270815981/ > /dev/null 2>&1 && sleep 0' 15627 1726882469.57661: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882469.57666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882469.57710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.57732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882469.57852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882469.57915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882469.58051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882469.58065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882469.58268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882469.60180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882469.60183: stdout chunk (state=3): >>><<< 15627 1726882469.60190: stderr chunk (state=3): >>><<< 15627 1726882469.60207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
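In the stderr above, `auto-mux: Trying existing master` followed by `Received exit status from master 0` shows the OpenSSH ControlMaster connection was reused rather than re-established. A quick way to spot that pattern when skimming such logs — a hypothetical helper that assumes the exact OpenSSH debug phrasing shown here:

```python
def reused_control_master(stderr):
    """True if OpenSSH debug output shows an existing mux master was reused
    and the remote command exited 0 (phrasing as seen in the log above)."""
    return ("auto-mux: Trying existing master" in stderr
            and "Received exit status from master 0" in stderr)

sample = (
    "debug1: auto-mux: Trying existing master\n"
    "debug2: mux_client_hello_exchange: master version 4\n"
    "debug2: Received exit status from master 0\n"
)
print(reused_control_master(sample))  # True
```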
15627 1726882469.60214: handler run complete 15627 1726882469.61634: variable 'ansible_facts' from source: unknown 15627 1726882469.62536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882469.67194: variable 'ansible_facts' from source: unknown 15627 1726882469.68374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882469.70013: attempt loop complete, returning result 15627 1726882469.70025: _execute() done 15627 1726882469.70028: dumping result to json 15627 1726882469.70573: done dumping result, returning 15627 1726882469.70584: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-2847-7723-000000000193] 15627 1726882469.70587: sending task result for task 0e448fcc-3ce9-2847-7723-000000000193 15627 1726882469.74533: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000193 15627 1726882469.74537: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882469.74638: no more pending results, returning what we have 15627 1726882469.74640: results queue empty 15627 1726882469.74641: checking for any_errors_fatal 15627 1726882469.74646: done checking for any_errors_fatal 15627 1726882469.74647: checking for max_fail_percentage 15627 1726882469.74648: done checking for max_fail_percentage 15627 1726882469.74649: checking to see if all hosts have failed and the running result is not ok 15627 1726882469.74650: done checking to see if all hosts have failed 15627 1726882469.74651: getting the remaining hosts for this loop 15627 1726882469.74652: done getting the remaining hosts for this loop 15627 1726882469.74655: getting the next task for host managed_node1 15627 1726882469.74667: done getting next task for 
host managed_node1 15627 1726882469.74671: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882469.74673: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882469.74683: getting variables 15627 1726882469.74684: in VariableManager get_vars() 15627 1726882469.74715: Calling all_inventory to load vars for managed_node1 15627 1726882469.74717: Calling groups_inventory to load vars for managed_node1 15627 1726882469.74720: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882469.74729: Calling all_plugins_play to load vars for managed_node1 15627 1726882469.74731: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882469.74734: Calling groups_plugins_play to load vars for managed_node1 15627 1726882469.76893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882469.80858: done with get_vars() 15627 1726882469.80886: done getting variables 15627 1726882469.81020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:29 -0400 (0:00:01.131) 0:00:09.562 ****** 15627 1726882469.81050: entering _queue_task() for managed_node1/debug 15627 1726882469.81403: worker is 1 (out of 1 available) 15627 1726882469.81416: exiting 
_queue_task() for managed_node1/debug 15627 1726882469.81427: done queuing things up, now waiting for results queue to drain 15627 1726882469.81429: waiting for pending results... 15627 1726882469.82780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882469.82999: in run() - task 0e448fcc-3ce9-2847-7723-000000000015 15627 1726882469.83020: variable 'ansible_search_path' from source: unknown 15627 1726882469.83027: variable 'ansible_search_path' from source: unknown 15627 1726882469.83188: calling self._execute() 15627 1726882469.83374: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882469.83393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882469.83407: variable 'omit' from source: magic vars 15627 1726882469.84166: variable 'ansible_distribution_major_version' from source: facts 15627 1726882469.84185: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882469.84197: variable 'omit' from source: magic vars 15627 1726882469.84236: variable 'omit' from source: magic vars 15627 1726882469.84461: variable 'network_provider' from source: set_fact 15627 1726882469.84601: variable 'omit' from source: magic vars 15627 1726882469.84649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882469.84700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882469.84724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882469.84819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882469.84836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 
1726882469.84872: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882469.84914: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882469.84922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882469.85142: Set connection var ansible_timeout to 10 15627 1726882469.85158: Set connection var ansible_shell_executable to /bin/sh 15627 1726882469.85243: Set connection var ansible_connection to ssh 15627 1726882469.85258: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882469.85272: Set connection var ansible_pipelining to False 15627 1726882469.85280: Set connection var ansible_shell_type to sh 15627 1726882469.85309: variable 'ansible_shell_executable' from source: unknown 15627 1726882469.85318: variable 'ansible_connection' from source: unknown 15627 1726882469.85325: variable 'ansible_module_compression' from source: unknown 15627 1726882469.85332: variable 'ansible_shell_type' from source: unknown 15627 1726882469.85347: variable 'ansible_shell_executable' from source: unknown 15627 1726882469.85461: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882469.85475: variable 'ansible_pipelining' from source: unknown 15627 1726882469.85483: variable 'ansible_timeout' from source: unknown 15627 1726882469.85491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882469.85635: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882469.85795: variable 'omit' from source: magic vars 15627 1726882469.85805: starting attempt loop 15627 1726882469.85811: running the handler 15627 1726882469.85860: handler run complete 15627 
1726882469.85885: attempt loop complete, returning result 15627 1726882469.86008: _execute() done 15627 1726882469.86015: dumping result to json 15627 1726882469.86022: done dumping result, returning 15627 1726882469.86033: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-2847-7723-000000000015] 15627 1726882469.86043: sending task result for task 0e448fcc-3ce9-2847-7723-000000000015 ok: [managed_node1] => {} MSG: Using network provider: nm 15627 1726882469.86212: no more pending results, returning what we have 15627 1726882469.86215: results queue empty 15627 1726882469.86217: checking for any_errors_fatal 15627 1726882469.86227: done checking for any_errors_fatal 15627 1726882469.86227: checking for max_fail_percentage 15627 1726882469.86229: done checking for max_fail_percentage 15627 1726882469.86230: checking to see if all hosts have failed and the running result is not ok 15627 1726882469.86231: done checking to see if all hosts have failed 15627 1726882469.86232: getting the remaining hosts for this loop 15627 1726882469.86234: done getting the remaining hosts for this loop 15627 1726882469.86238: getting the next task for host managed_node1 15627 1726882469.86248: done getting next task for host managed_node1 15627 1726882469.86252: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882469.86257: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882469.86269: getting variables 15627 1726882469.86271: in VariableManager get_vars() 15627 1726882469.86310: Calling all_inventory to load vars for managed_node1 15627 1726882469.86313: Calling groups_inventory to load vars for managed_node1 15627 1726882469.86316: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882469.86326: Calling all_plugins_play to load vars for managed_node1 15627 1726882469.86329: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882469.86332: Calling groups_plugins_play to load vars for managed_node1 15627 1726882469.87471: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000015 15627 1726882469.87474: WORKER PROCESS EXITING 15627 1726882469.89144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882469.93372: done with get_vars() 15627 1726882469.93412: done getting variables 15627 1726882469.93650: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:29 -0400 (0:00:00.127) 0:00:09.689 ****** 15627 1726882469.93835: entering _queue_task() for managed_node1/fail 15627 1726882469.93837: Creating lock for fail 15627 1726882469.94721: worker is 1 (out of 1 available) 15627 1726882469.94739: exiting _queue_task() for managed_node1/fail 15627 1726882469.94758: done queuing things up, now waiting for results queue to drain 15627 1726882469.94760: waiting for 
pending results... 15627 1726882469.95618: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882469.95847: in run() - task 0e448fcc-3ce9-2847-7723-000000000016 15627 1726882469.95872: variable 'ansible_search_path' from source: unknown 15627 1726882469.95897: variable 'ansible_search_path' from source: unknown 15627 1726882469.96036: calling self._execute() 15627 1726882469.96242: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882469.96257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882469.96276: variable 'omit' from source: magic vars 15627 1726882469.96995: variable 'ansible_distribution_major_version' from source: facts 15627 1726882469.97012: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882469.97242: variable 'network_state' from source: role '' defaults 15627 1726882469.97322: Evaluated conditional (network_state != {}): False 15627 1726882469.97330: when evaluation is False, skipping this task 15627 1726882469.97337: _execute() done 15627 1726882469.97345: dumping result to json 15627 1726882469.97352: done dumping result, returning 15627 1726882469.97428: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-2847-7723-000000000016] 15627 1726882469.97440: sending task result for task 0e448fcc-3ce9-2847-7723-000000000016 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882469.97627: no more pending results, returning what we have 15627 1726882469.97631: results queue empty 15627 1726882469.97632: checking for any_errors_fatal 15627 
1726882469.97638: done checking for any_errors_fatal 15627 1726882469.97639: checking for max_fail_percentage 15627 1726882469.97640: done checking for max_fail_percentage 15627 1726882469.97642: checking to see if all hosts have failed and the running result is not ok 15627 1726882469.97643: done checking to see if all hosts have failed 15627 1726882469.97644: getting the remaining hosts for this loop 15627 1726882469.97648: done getting the remaining hosts for this loop 15627 1726882469.97653: getting the next task for host managed_node1 15627 1726882469.97666: done getting next task for host managed_node1 15627 1726882469.97670: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882469.97673: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882469.97690: getting variables 15627 1726882469.97692: in VariableManager get_vars() 15627 1726882469.97743: Calling all_inventory to load vars for managed_node1 15627 1726882469.97748: Calling groups_inventory to load vars for managed_node1 15627 1726882469.97751: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882469.97773: Calling all_plugins_play to load vars for managed_node1 15627 1726882469.97777: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882469.97781: Calling groups_plugins_play to load vars for managed_node1 15627 1726882469.98752: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000016 15627 1726882469.98758: WORKER PROCESS EXITING 15627 1726882469.99990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.01843: done with get_vars() 15627 1726882470.01872: done getting variables 15627 1726882470.01930: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:30 -0400 (0:00:00.081) 0:00:09.771 ****** 15627 1726882470.01970: entering _queue_task() for managed_node1/fail 15627 1726882470.02248: worker is 1 (out of 1 available) 15627 1726882470.02260: exiting _queue_task() for managed_node1/fail 15627 1726882470.02276: done queuing things up, now waiting for results queue to drain 15627 1726882470.02277: waiting for pending results... 
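Both "Abort applying the network state configuration" tasks above skip because the role default `network_state` is an empty dict, so the `when: network_state != {}` guard evaluates to False. The same evaluation in plain Python — a sketch of the comparison only, not of Ansible's Jinja templating machinery:

```python
network_state = {}  # role default, per "from source: role '' defaults" above

# Ansible's `when: network_state != {}` reduces to this comparison;
# a False result means the task is skipped, not failed.
should_run = network_state != {}
print("run" if should_run else "skip")  # skip
```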
15627 1726882470.02551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882470.02687: in run() - task 0e448fcc-3ce9-2847-7723-000000000017 15627 1726882470.02714: variable 'ansible_search_path' from source: unknown 15627 1726882470.02725: variable 'ansible_search_path' from source: unknown 15627 1726882470.02778: calling self._execute() 15627 1726882470.02897: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.02909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.02926: variable 'omit' from source: magic vars 15627 1726882470.03329: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.03347: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.03515: variable 'network_state' from source: role '' defaults 15627 1726882470.03533: Evaluated conditional (network_state != {}): False 15627 1726882470.03541: when evaluation is False, skipping this task 15627 1726882470.03548: _execute() done 15627 1726882470.03566: dumping result to json 15627 1726882470.03576: done dumping result, returning 15627 1726882470.03595: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-2847-7723-000000000017] 15627 1726882470.03615: sending task result for task 0e448fcc-3ce9-2847-7723-000000000017 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882470.03775: no more pending results, returning what we have 15627 1726882470.03779: results queue empty 15627 1726882470.03780: checking for any_errors_fatal 15627 1726882470.03787: done checking for any_errors_fatal 
15627 1726882470.03788: checking for max_fail_percentage 15627 1726882470.03789: done checking for max_fail_percentage 15627 1726882470.03790: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.03792: done checking to see if all hosts have failed 15627 1726882470.03792: getting the remaining hosts for this loop 15627 1726882470.03794: done getting the remaining hosts for this loop 15627 1726882470.03798: getting the next task for host managed_node1 15627 1726882470.03806: done getting next task for host managed_node1 15627 1726882470.03810: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882470.03812: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882470.03834: getting variables 15627 1726882470.03837: in VariableManager get_vars() 15627 1726882470.03880: Calling all_inventory to load vars for managed_node1 15627 1726882470.03883: Calling groups_inventory to load vars for managed_node1 15627 1726882470.03886: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.03898: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.03901: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.03904: Calling groups_plugins_play to load vars for managed_node1 15627 1726882470.04945: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000017 15627 1726882470.04948: WORKER PROCESS EXITING 15627 1726882470.06175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.10447: done with get_vars() 15627 1726882470.10481: done getting variables 15627 1726882470.10546: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:30 -0400 (0:00:00.086) 0:00:09.857 ****** 15627 1726882470.10582: entering _queue_task() for managed_node1/fail 15627 1726882470.10895: worker is 1 (out of 1 available) 15627 1726882470.10909: exiting _queue_task() for managed_node1/fail 15627 1726882470.10920: done queuing things up, now waiting for results queue to drain 15627 1726882470.10921: waiting for pending results... 
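The teaming-abort task is gated on `ansible_distribution_major_version | int > 9`; on this EL9 host the gathered fact is the string `"9"`, so the cast-and-compare yields False and the task skips. The equivalent check in plain Python — a sketch of the filter chain, not Ansible's templating:

```python
ansible_distribution_major_version = "9"  # gathered facts store this as a string

# Jinja's `| int` cast followed by the comparison evaluated in the log:
is_el10_or_later = int(ansible_distribution_major_version) > 9
print(is_el10_or_later)  # False
```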
15627 1726882470.11202: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882470.11314: in run() - task 0e448fcc-3ce9-2847-7723-000000000018 15627 1726882470.11335: variable 'ansible_search_path' from source: unknown 15627 1726882470.11343: variable 'ansible_search_path' from source: unknown 15627 1726882470.11391: calling self._execute() 15627 1726882470.11488: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.11499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.11511: variable 'omit' from source: magic vars 15627 1726882470.11897: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.11919: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.13016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882470.17456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882470.17559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882470.17606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882470.17652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882470.17687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882470.17779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.17812: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.17848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.17900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.17920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.18033: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.18057: Evaluated conditional (ansible_distribution_major_version | int > 9): False 15627 1726882470.18073: when evaluation is False, skipping this task 15627 1726882470.18082: _execute() done 15627 1726882470.18088: dumping result to json 15627 1726882470.18094: done dumping result, returning 15627 1726882470.18105: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-2847-7723-000000000018] 15627 1726882470.18115: sending task result for task 0e448fcc-3ce9-2847-7723-000000000018 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 15627 1726882470.18397: no more pending results, returning what we have 15627 1726882470.18401: results queue empty 15627 1726882470.18402: checking for any_errors_fatal 15627 1726882470.18408: done checking for any_errors_fatal 15627 
1726882470.18408: checking for max_fail_percentage 15627 1726882470.18410: done checking for max_fail_percentage 15627 1726882470.18411: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.18412: done checking to see if all hosts have failed 15627 1726882470.18413: getting the remaining hosts for this loop 15627 1726882470.18415: done getting the remaining hosts for this loop 15627 1726882470.18419: getting the next task for host managed_node1 15627 1726882470.18426: done getting next task for host managed_node1 15627 1726882470.18430: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882470.18432: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882470.18445: getting variables 15627 1726882470.18446: in VariableManager get_vars() 15627 1726882470.18491: Calling all_inventory to load vars for managed_node1 15627 1726882470.18495: Calling groups_inventory to load vars for managed_node1 15627 1726882470.18498: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.18508: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.18511: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.18514: Calling groups_plugins_play to load vars for managed_node1 15627 1726882470.19665: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000018 15627 1726882470.19669: WORKER PROCESS EXITING 15627 1726882470.21645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.23478: done with get_vars() 15627 1726882470.23506: done getting variables 15627 1726882470.23632: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:30 -0400 (0:00:00.130) 0:00:09.988 ****** 15627 1726882470.23666: entering _queue_task() for managed_node1/dnf 15627 1726882470.24346: worker is 1 (out of 1 available) 15627 1726882470.24472: exiting _queue_task() for managed_node1/dnf 15627 1726882470.24483: done queuing things up, now waiting for results queue to drain 15627 1726882470.24484: waiting for pending results... 
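The skip recorded above ("Abort applying teaming configuration ... EL10 or later", task path roles/network/tasks/main.yml:25) is driven by a `when:` guard that the log shows evaluating to False (`ansible_distribution_major_version | int > 9`). A minimal sketch of what such a guard task could look like follows; only the task name, action plugin (`fail`), and condition come from the log, the message body is an assumption, not the role's verbatim source:

```yaml
# Illustrative sketch only -- not the verbatim task from
# fedora.linux_system_roles.network (roles/network/tasks/main.yml:25).
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming configuration is not supported on EL10 or later.  # assumed wording
  # The log shows this conditional evaluating to False on this host,
  # so the task is skipped rather than failing the play.
  when: ansible_distribution_major_version | int > 9
```

When the guard is False, the TaskExecutor short-circuits before the action plugin runs, which is why the result JSON carries `"skip_reason": "Conditional result was False"` and `"changed": false`.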
15627 1726882470.25186: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882470.25299: in run() - task 0e448fcc-3ce9-2847-7723-000000000019 15627 1726882470.25318: variable 'ansible_search_path' from source: unknown 15627 1726882470.25328: variable 'ansible_search_path' from source: unknown 15627 1726882470.25375: calling self._execute() 15627 1726882470.25476: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.25488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.25500: variable 'omit' from source: magic vars 15627 1726882470.25880: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.25901: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.26333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882470.29657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882470.29768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882470.29844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882470.29893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882470.29957: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882470.30074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.30105: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.30134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.30193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.30211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.30348: variable 'ansible_distribution' from source: facts 15627 1726882470.30360: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.30394: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15627 1726882470.30528: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882470.30685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.30728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.30759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.30810: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.30837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.30884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.30911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.30947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.30996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.31026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.31084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.31112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 
1726882470.31168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.31212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.31244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.31747: variable 'network_connections' from source: play vars 15627 1726882470.31768: variable 'interface' from source: set_fact 15627 1726882470.31845: variable 'interface' from source: set_fact 15627 1726882470.31861: variable 'interface' from source: set_fact 15627 1726882470.31933: variable 'interface' from source: set_fact 15627 1726882470.32015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882470.32490: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882470.32530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882470.32575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882470.32698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882470.32743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882470.32796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882470.32910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.32940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882470.33118: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882470.33823: variable 'network_connections' from source: play vars 15627 1726882470.33882: variable 'interface' from source: set_fact 15627 1726882470.33958: variable 'interface' from source: set_fact 15627 1726882470.33982: variable 'interface' from source: set_fact 15627 1726882470.34050: variable 'interface' from source: set_fact 15627 1726882470.34112: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882470.34120: when evaluation is False, skipping this task 15627 1726882470.34127: _execute() done 15627 1726882470.34133: dumping result to json 15627 1726882470.34139: done dumping result, returning 15627 1726882470.34150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000019] 15627 1726882470.34162: sending task result for task 0e448fcc-3ce9-2847-7723-000000000019 15627 1726882470.34282: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000019 15627 1726882470.34289: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15627 1726882470.34341: no more pending results, returning what we have 15627 1726882470.34344: results queue empty 15627 1726882470.34345: checking for any_errors_fatal 15627 1726882470.34351: done checking for any_errors_fatal 15627 1726882470.34351: checking for max_fail_percentage 15627 1726882470.34353: done checking for max_fail_percentage 15627 1726882470.34357: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.34358: done checking to see if all hosts have failed 15627 1726882470.34358: getting the remaining hosts for this loop 15627 1726882470.34360: done getting the remaining hosts for this loop 15627 1726882470.34366: getting the next task for host managed_node1 15627 1726882470.34374: done getting next task for host managed_node1 15627 1726882470.34378: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882470.34380: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882470.34392: getting variables 15627 1726882470.34394: in VariableManager get_vars() 15627 1726882470.34431: Calling all_inventory to load vars for managed_node1 15627 1726882470.34434: Calling groups_inventory to load vars for managed_node1 15627 1726882470.34436: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.34446: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.34449: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.34452: Calling groups_plugins_play to load vars for managed_node1 15627 1726882470.36926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.39520: done with get_vars() 15627 1726882470.39544: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15627 1726882470.39625: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:30 -0400 (0:00:00.159) 0:00:10.148 ****** 15627 1726882470.39653: entering _queue_task() for managed_node1/yum 15627 1726882470.39657: Creating lock for yum 15627 1726882470.39957: worker is 1 (out of 1 available) 15627 1726882470.39971: exiting _queue_task() for managed_node1/yum 15627 1726882470.39983: done queuing things up, now waiting for results queue to drain 15627 1726882470.39985: waiting for pending results... 
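Two details in the block above are worth noting: the line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` shows ansible-core 2.17 routing the legacy `yum` action to the `dnf` action plugin, and the subsequent skip is gated on `ansible_distribution_major_version | int < 8` (the YUM path only applies to EL7 and earlier). A hedged sketch of how a role task commonly expresses such a check; only the task name, the `yum` action, and the `when:` condition come from the log, the module arguments are assumptions:

```yaml
# Illustrative sketch only -- the actual task lives at
# roles/network/tasks/main.yml:48 in the collection; module args are assumed.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:        # redirected to ansible.builtin.dnf on ansible-core 2.17
    name: NetworkManager      # hypothetical package list
    state: latest
  check_mode: true            # assumed: probe for updates without applying them
  when: ansible_distribution_major_version | int < 8
```

On this EL9-class host the condition is False, so the task is skipped before the (redirected) dnf action plugin is ever invoked.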
15627 1726882470.40319: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882470.41037: in run() - task 0e448fcc-3ce9-2847-7723-00000000001a 15627 1726882470.41057: variable 'ansible_search_path' from source: unknown 15627 1726882470.41071: variable 'ansible_search_path' from source: unknown 15627 1726882470.41318: calling self._execute() 15627 1726882470.41447: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.41465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.41480: variable 'omit' from source: magic vars 15627 1726882470.42977: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.43053: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.43975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882470.46837: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882470.46917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882470.47167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882470.47422: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882470.47486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882470.47701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.47733: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.47772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.47992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.48184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.48471: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.48511: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15627 1726882470.48519: when evaluation is False, skipping this task 15627 1726882470.48526: _execute() done 15627 1726882470.48531: dumping result to json 15627 1726882470.48537: done dumping result, returning 15627 1726882470.48548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-00000000001a] 15627 1726882470.48564: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15627 1726882470.48736: no more pending results, returning what we have 15627 1726882470.48740: results queue empty 15627 1726882470.48741: checking for any_errors_fatal 15627 1726882470.48746: done 
checking for any_errors_fatal 15627 1726882470.48747: checking for max_fail_percentage 15627 1726882470.48752: done checking for max_fail_percentage 15627 1726882470.48755: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.48757: done checking to see if all hosts have failed 15627 1726882470.48757: getting the remaining hosts for this loop 15627 1726882470.48762: done getting the remaining hosts for this loop 15627 1726882470.48769: getting the next task for host managed_node1 15627 1726882470.48781: done getting next task for host managed_node1 15627 1726882470.48787: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882470.48789: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882470.48809: getting variables 15627 1726882470.48812: in VariableManager get_vars() 15627 1726882470.48858: Calling all_inventory to load vars for managed_node1 15627 1726882470.48862: Calling groups_inventory to load vars for managed_node1 15627 1726882470.48868: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.48885: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.48889: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.48893: Calling groups_plugins_play to load vars for managed_node1 15627 1726882470.50088: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001a 15627 1726882470.50091: WORKER PROCESS EXITING 15627 1726882470.51228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.54067: done with get_vars() 15627 1726882470.54106: done getting variables 15627 1726882470.54211: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:30 -0400 (0:00:00.145) 0:00:10.294 ****** 15627 1726882470.54241: entering _queue_task() for managed_node1/fail 15627 1726882470.54587: worker is 1 (out of 1 available) 15627 1726882470.54599: exiting _queue_task() for managed_node1/fail 15627 1726882470.54615: done queuing things up, now waiting for results queue to drain 15627 1726882470.54617: waiting for pending results... 
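The consent task queued above is another `fail`-based guard: the log shows it skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` (both role defaults, re-evaluated against the play's `network_connections` and the `interface` set_fact) came out True. A minimal sketch of the pattern; the condition and action plugin are from the log, the message is an assumption:

```yaml
# Illustrative sketch; the real task is at roles/network/tasks/main.yml:60.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Wireless or team interfaces require a NetworkManager restart;  # assumed wording
      re-run after confirming the restart is acceptable.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because the play's connections define neither wireless nor team interfaces, both halves of the `or` are False and the result is the familiar `"skip_reason": "Conditional result was False"`.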
15627 1726882470.54886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882470.55098: in run() - task 0e448fcc-3ce9-2847-7723-00000000001b 15627 1726882470.55117: variable 'ansible_search_path' from source: unknown 15627 1726882470.55126: variable 'ansible_search_path' from source: unknown 15627 1726882470.55176: calling self._execute() 15627 1726882470.55270: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.55284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.55297: variable 'omit' from source: magic vars 15627 1726882470.56279: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.56295: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.56553: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882470.57677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882470.61523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882470.61597: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882470.61643: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882470.61691: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882470.61727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882470.63378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882470.63412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.63486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.63531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.63671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.63719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.63747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.63897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.63940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.63965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.64012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.64079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.64112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.64159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.64184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.64377: variable 'network_connections' from source: play vars 15627 1726882470.64431: variable 'interface' from source: set_fact 15627 1726882470.64513: variable 'interface' from source: set_fact 15627 1726882470.64645: variable 'interface' from source: set_fact 15627 1726882470.64715: variable 'interface' from source: set_fact 15627 1726882470.64907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882470.65376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882470.65450: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882470.65542: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882470.65581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882470.65665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882470.65752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882470.65868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.65900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882470.66084: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882470.66643: variable 'network_connections' from source: play vars 15627 1726882470.66657: variable 'interface' from source: set_fact 15627 1726882470.66757: variable 'interface' from source: set_fact 15627 1726882470.66824: variable 'interface' from source: set_fact 15627 1726882470.66888: variable 'interface' from source: set_fact 15627 1726882470.67003: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882470.67044: when evaluation is False, skipping this task 15627 1726882470.67052: _execute() done 15627 1726882470.67086: dumping result to json 15627 1726882470.67095: done dumping result, returning 15627 1726882470.67108: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-00000000001b] 15627 1726882470.67158: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001b skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882470.67307: no more pending results, returning what we have 15627 1726882470.67311: results queue empty 15627 1726882470.67312: checking for any_errors_fatal 15627 1726882470.67316: done checking for any_errors_fatal 15627 1726882470.67317: checking for max_fail_percentage 15627 1726882470.67319: done checking for max_fail_percentage 15627 1726882470.67320: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.67321: done checking to see if all hosts have failed 15627 1726882470.67322: getting the remaining hosts for this loop 15627 1726882470.67324: done getting the remaining hosts for this loop 15627 1726882470.67327: getting the next task for host managed_node1 15627 1726882470.67335: done getting next task for host managed_node1 15627 1726882470.67339: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15627 1726882470.67341: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882470.67353: getting variables 15627 1726882470.67357: in VariableManager get_vars() 15627 1726882470.67397: Calling all_inventory to load vars for managed_node1 15627 1726882470.67400: Calling groups_inventory to load vars for managed_node1 15627 1726882470.67403: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.67413: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.67416: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.67419: Calling groups_plugins_play to load vars for managed_node1 15627 1726882470.68539: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001b 15627 1726882470.68542: WORKER PROCESS EXITING 15627 1726882470.70226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882470.72917: done with get_vars() 15627 1726882470.72940: done getting variables 15627 1726882470.73002: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:30 -0400 (0:00:00.187) 0:00:10.482 ****** 15627 1726882470.73038: entering _queue_task() for managed_node1/package 15627 1726882470.73349: worker is 1 (out of 1 available) 15627 1726882470.73366: exiting _queue_task() for managed_node1/package 15627 1726882470.73377: done queuing things up, now waiting for results queue to drain 15627 1726882470.73378: waiting for pending results... 
15627 1726882470.74377: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15627 1726882470.74472: in run() - task 0e448fcc-3ce9-2847-7723-00000000001c 15627 1726882470.74485: variable 'ansible_search_path' from source: unknown 15627 1726882470.74488: variable 'ansible_search_path' from source: unknown 15627 1726882470.74525: calling self._execute() 15627 1726882470.74606: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882470.74612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882470.74623: variable 'omit' from source: magic vars 15627 1726882470.75813: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.75824: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882470.76218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882470.76972: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882470.77013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882470.77045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882470.77079: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882470.77381: variable 'network_packages' from source: role '' defaults 15627 1726882470.77483: variable '__network_provider_setup' from source: role '' defaults 15627 1726882470.77493: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882470.77561: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882470.77565: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882470.77908: variable 
'__network_packages_default_nm' from source: role '' defaults 15627 1726882470.78456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882470.81930: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882470.82688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882470.82723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882470.82751: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882470.82781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882470.82860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.82890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.82915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.82956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.82968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 
1726882470.83010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.83033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.83058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.83094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.83108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.83520: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882470.83762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.83917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.83942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.84865: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.84881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.84967: variable 'ansible_python' from source: facts 15627 1726882470.85113: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882470.85690: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882470.85951: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882470.86271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.86274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.86277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.86279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.86281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.86283: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882470.86293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882470.86295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.86381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882470.86384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882470.86768: variable 'network_connections' from source: play vars 15627 1726882470.86776: variable 'interface' from source: set_fact 15627 1726882470.87076: variable 'interface' from source: set_fact 15627 1726882470.87087: variable 'interface' from source: set_fact 15627 1726882470.87382: variable 'interface' from source: set_fact 15627 1726882470.87443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882470.87471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882470.87501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882470.87525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882470.87772: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882470.88379: variable 'network_connections' from source: play vars 15627 1726882470.88402: variable 'interface' from source: set_fact 15627 1726882470.88551: variable 'interface' from source: set_fact 15627 1726882470.88558: variable 'interface' from source: set_fact 15627 1726882470.88810: variable 'interface' from source: set_fact 15627 1726882470.88876: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882470.88957: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882470.89548: variable 'network_connections' from source: play vars 15627 1726882470.89551: variable 'interface' from source: set_fact 15627 1726882470.89614: variable 'interface' from source: set_fact 15627 1726882470.89619: variable 'interface' from source: set_fact 15627 1726882470.89886: variable 'interface' from source: set_fact 15627 1726882470.89908: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882470.89989: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882470.90894: variable 'network_connections' from source: play vars 15627 1726882470.90899: variable 'interface' from source: set_fact 15627 1726882470.90961: variable 'interface' from source: set_fact 15627 1726882470.90964: variable 'interface' from source: set_fact 15627 1726882470.91028: variable 'interface' from source: set_fact 15627 1726882470.91093: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 
1726882470.91477: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882470.91484: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882470.91545: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882470.92088: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882470.93276: variable 'network_connections' from source: play vars 15627 1726882470.93281: variable 'interface' from source: set_fact 15627 1726882470.93338: variable 'interface' from source: set_fact 15627 1726882470.93344: variable 'interface' from source: set_fact 15627 1726882470.93401: variable 'interface' from source: set_fact 15627 1726882470.93411: variable 'ansible_distribution' from source: facts 15627 1726882470.93414: variable '__network_rh_distros' from source: role '' defaults 15627 1726882470.93419: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.93446: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882470.93603: variable 'ansible_distribution' from source: facts 15627 1726882470.93608: variable '__network_rh_distros' from source: role '' defaults 15627 1726882470.93610: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.93621: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882470.93783: variable 'ansible_distribution' from source: facts 15627 1726882470.93786: variable '__network_rh_distros' from source: role '' defaults 15627 1726882470.93791: variable 'ansible_distribution_major_version' from source: facts 15627 1726882470.93827: variable 'network_provider' from source: set_fact 15627 1726882470.93841: variable 'ansible_facts' from source: unknown 15627 1726882470.98334: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15627 1726882470.98338: when evaluation is False, skipping this task 15627 1726882470.98343: _execute() done 15627 1726882470.98349: dumping result to json 15627 1726882470.98355: done dumping result, returning 15627 1726882470.98433: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-2847-7723-00000000001c] 15627 1726882470.98510: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001c 15627 1726882470.98612: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001c 15627 1726882470.98615: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15627 1726882470.98666: no more pending results, returning what we have 15627 1726882470.98670: results queue empty 15627 1726882470.98671: checking for any_errors_fatal 15627 1726882470.98677: done checking for any_errors_fatal 15627 1726882470.98677: checking for max_fail_percentage 15627 1726882470.98679: done checking for max_fail_percentage 15627 1726882470.98680: checking to see if all hosts have failed and the running result is not ok 15627 1726882470.98681: done checking to see if all hosts have failed 15627 1726882470.98681: getting the remaining hosts for this loop 15627 1726882470.98683: done getting the remaining hosts for this loop 15627 1726882470.98688: getting the next task for host managed_node1 15627 1726882470.98703: done getting next task for host managed_node1 15627 1726882470.98708: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882470.98710: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882470.98725: getting variables 15627 1726882470.98727: in VariableManager get_vars() 15627 1726882470.98794: Calling all_inventory to load vars for managed_node1 15627 1726882470.98797: Calling groups_inventory to load vars for managed_node1 15627 1726882470.98800: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882470.98816: Calling all_plugins_play to load vars for managed_node1 15627 1726882470.98819: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882470.98822: Calling groups_plugins_play to load vars for managed_node1 15627 1726882471.01276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882471.13096: done with get_vars() 15627 1726882471.13120: done getting variables 15627 1726882471.13287: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:31 -0400 (0:00:00.402) 0:00:10.884 ****** 15627 1726882471.13313: entering _queue_task() for managed_node1/package 15627 1726882471.14156: worker is 1 (out of 1 available) 15627 1726882471.14283: exiting _queue_task() for managed_node1/package 15627 1726882471.14294: done queuing things up, now waiting for results queue to drain 15627 1726882471.14295: waiting for pending results... 
15627 1726882471.14612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882471.14694: in run() - task 0e448fcc-3ce9-2847-7723-00000000001d 15627 1726882471.14706: variable 'ansible_search_path' from source: unknown 15627 1726882471.14710: variable 'ansible_search_path' from source: unknown 15627 1726882471.14746: calling self._execute() 15627 1726882471.14830: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.14835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.14844: variable 'omit' from source: magic vars 15627 1726882471.16048: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.16060: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882471.16177: variable 'network_state' from source: role '' defaults 15627 1726882471.16186: Evaluated conditional (network_state != {}): False 15627 1726882471.16190: when evaluation is False, skipping this task 15627 1726882471.16193: _execute() done 15627 1726882471.16197: dumping result to json 15627 1726882471.16200: done dumping result, returning 15627 1726882471.16203: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-2847-7723-00000000001d] 15627 1726882471.16212: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001d 15627 1726882471.16318: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001d 15627 1726882471.16321: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882471.16372: no more pending results, returning what we have 15627 1726882471.16375: results queue empty 15627 1726882471.16376: checking 
for any_errors_fatal 15627 1726882471.16384: done checking for any_errors_fatal 15627 1726882471.16384: checking for max_fail_percentage 15627 1726882471.16386: done checking for max_fail_percentage 15627 1726882471.16387: checking to see if all hosts have failed and the running result is not ok 15627 1726882471.16388: done checking to see if all hosts have failed 15627 1726882471.16388: getting the remaining hosts for this loop 15627 1726882471.16390: done getting the remaining hosts for this loop 15627 1726882471.16394: getting the next task for host managed_node1 15627 1726882471.16400: done getting next task for host managed_node1 15627 1726882471.16404: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882471.16405: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882471.16417: getting variables 15627 1726882471.16419: in VariableManager get_vars() 15627 1726882471.16450: Calling all_inventory to load vars for managed_node1 15627 1726882471.16453: Calling groups_inventory to load vars for managed_node1 15627 1726882471.16458: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882471.16468: Calling all_plugins_play to load vars for managed_node1 15627 1726882471.16470: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882471.16473: Calling groups_plugins_play to load vars for managed_node1 15627 1726882471.20083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882471.24987: done with get_vars() 15627 1726882471.25015: done getting variables 15627 1726882471.25197: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:31 -0400 (0:00:00.119) 0:00:11.003 ****** 15627 1726882471.25232: entering _queue_task() for managed_node1/package 15627 1726882471.26022: worker is 1 (out of 1 available) 15627 1726882471.26035: exiting _queue_task() for managed_node1/package 15627 1726882471.26048: done queuing things up, now waiting for results queue to drain 15627 1726882471.26049: waiting for pending results... 
15627 1726882471.27188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882471.27274: in run() - task 0e448fcc-3ce9-2847-7723-00000000001e 15627 1726882471.27287: variable 'ansible_search_path' from source: unknown 15627 1726882471.27291: variable 'ansible_search_path' from source: unknown 15627 1726882471.27326: calling self._execute() 15627 1726882471.27412: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.27417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.27429: variable 'omit' from source: magic vars 15627 1726882471.28990: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.29003: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882471.29119: variable 'network_state' from source: role '' defaults 15627 1726882471.29127: Evaluated conditional (network_state != {}): False 15627 1726882471.29130: when evaluation is False, skipping this task 15627 1726882471.29133: _execute() done 15627 1726882471.29135: dumping result to json 15627 1726882471.29138: done dumping result, returning 15627 1726882471.29148: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-2847-7723-00000000001e] 15627 1726882471.29156: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001e 15627 1726882471.29257: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001e 15627 1726882471.29260: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882471.29329: no more pending results, returning what we have 15627 1726882471.29333: results queue empty 15627 1726882471.29334: checking for 
any_errors_fatal 15627 1726882471.29344: done checking for any_errors_fatal 15627 1726882471.29345: checking for max_fail_percentage 15627 1726882471.29346: done checking for max_fail_percentage 15627 1726882471.29347: checking to see if all hosts have failed and the running result is not ok 15627 1726882471.29348: done checking to see if all hosts have failed 15627 1726882471.29349: getting the remaining hosts for this loop 15627 1726882471.29350: done getting the remaining hosts for this loop 15627 1726882471.29356: getting the next task for host managed_node1 15627 1726882471.29365: done getting next task for host managed_node1 15627 1726882471.29369: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882471.29370: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882471.29384: getting variables 15627 1726882471.29385: in VariableManager get_vars() 15627 1726882471.29417: Calling all_inventory to load vars for managed_node1 15627 1726882471.29419: Calling groups_inventory to load vars for managed_node1 15627 1726882471.29421: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882471.29430: Calling all_plugins_play to load vars for managed_node1 15627 1726882471.29433: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882471.29436: Calling groups_plugins_play to load vars for managed_node1 15627 1726882471.33444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882471.38304: done with get_vars() 15627 1726882471.38331: done getting variables 15627 1726882471.38578: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:31 -0400 (0:00:00.134) 0:00:11.138 ****** 15627 1726882471.38731: entering _queue_task() for managed_node1/service 15627 1726882471.38733: Creating lock for service 15627 1726882471.39604: worker is 1 (out of 1 available) 15627 1726882471.39618: exiting _queue_task() for managed_node1/service 15627 1726882471.39630: done queuing things up, now waiting for results queue to drain 15627 1726882471.39632: waiting for pending results... 
15627 1726882471.40779: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882471.41182: in run() - task 0e448fcc-3ce9-2847-7723-00000000001f 15627 1726882471.41195: variable 'ansible_search_path' from source: unknown 15627 1726882471.41199: variable 'ansible_search_path' from source: unknown 15627 1726882471.41233: calling self._execute() 15627 1726882471.41720: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.41725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.41734: variable 'omit' from source: magic vars 15627 1726882471.42886: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.42899: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882471.43017: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882471.43206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882471.50041: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882471.50118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882471.50153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882471.50190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882471.50217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882471.50696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15627 1726882471.50719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.50741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.50779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.50791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.50835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.50859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.51328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.51367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.51381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.51420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.51442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.51467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.52109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.52123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.52770: variable 'network_connections' from source: play vars 15627 1726882471.52773: variable 'interface' from source: set_fact 15627 1726882471.52776: variable 'interface' from source: set_fact 15627 1726882471.52778: variable 'interface' from source: set_fact 15627 1726882471.52781: variable 'interface' from source: set_fact 15627 1726882471.52783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882471.53142: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882471.53179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882471.53209: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882471.53238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882471.53304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882471.53327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882471.53351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.53376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882471.53430: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882471.53672: variable 'network_connections' from source: play vars 15627 1726882471.53678: variable 'interface' from source: set_fact 15627 1726882471.53737: variable 'interface' from source: set_fact 15627 1726882471.53742: variable 'interface' from source: set_fact 15627 1726882471.53802: variable 'interface' from source: set_fact 15627 1726882471.53833: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882471.53836: when evaluation is False, skipping this task 15627 1726882471.53839: _execute() done 15627 1726882471.53842: dumping result to json 15627 1726882471.53844: done dumping result, returning 15627 1726882471.53852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-00000000001f] 15627 1726882471.53868: sending task result for task 0e448fcc-3ce9-2847-7723-00000000001f 15627 1726882471.54152: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000001f 15627 1726882471.54158: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882471.54218: no more pending results, returning what we have 15627 1726882471.54221: results queue empty 15627 1726882471.54222: checking for any_errors_fatal 15627 1726882471.54229: done checking for any_errors_fatal 15627 1726882471.54230: checking for max_fail_percentage 15627 1726882471.54231: done checking for max_fail_percentage 15627 1726882471.54232: checking to see if all hosts have failed and the running result is not ok 15627 1726882471.54233: done checking to see if all hosts have failed 15627 1726882471.54233: getting the remaining hosts for this loop 15627 1726882471.54235: done getting the remaining hosts for this loop 15627 1726882471.54239: getting the next task for host managed_node1 15627 1726882471.54246: done getting next task for host managed_node1 15627 1726882471.54249: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882471.54251: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882471.54269: getting variables 15627 1726882471.54271: in VariableManager get_vars() 15627 1726882471.54310: Calling all_inventory to load vars for managed_node1 15627 1726882471.54313: Calling groups_inventory to load vars for managed_node1 15627 1726882471.54316: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882471.54326: Calling all_plugins_play to load vars for managed_node1 15627 1726882471.54329: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882471.54332: Calling groups_plugins_play to load vars for managed_node1 15627 1726882471.57950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882471.61890: done with get_vars() 15627 1726882471.61923: done getting variables 15627 1726882471.61988: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:31 -0400 (0:00:00.234) 0:00:11.373 ****** 15627 1726882471.62139: entering _queue_task() for managed_node1/service 15627 1726882471.62902: worker is 1 (out of 1 available) 15627 1726882471.62918: exiting _queue_task() for managed_node1/service 15627 1726882471.62931: done queuing things up, now waiting for results queue to drain 15627 1726882471.62932: waiting for pending results... 
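The skip recorded above happens because the task's `when` clause (`__network_wireless_connections_defined or __network_team_connections_defined`) evaluated to False. A minimal sketch of that decision logic, assuming a simplified stand-in for Ansible's real conditional machinery (the function name and return shape here are illustrative, not Ansible's actual API):

```python
def evaluate_skip(wireless_defined, team_defined):
    # Mirrors the conditional seen in the log: the "Restart NetworkManager"
    # task only runs when wireless or team connections are defined.
    condition = "__network_wireless_connections_defined or __network_team_connections_defined"
    if not (wireless_defined or team_defined):
        # Shape of the skip result emitted in the log above.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return None  # None => conditional passed, task would execute
```

In this run both role defaults were falsy, so the worker returned the skip result and the strategy moved on to the next task for `managed_node1`.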
15627 1726882471.63772: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882471.63860: in run() - task 0e448fcc-3ce9-2847-7723-000000000020 15627 1726882471.63874: variable 'ansible_search_path' from source: unknown 15627 1726882471.63877: variable 'ansible_search_path' from source: unknown 15627 1726882471.63911: calling self._execute() 15627 1726882471.63999: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.64003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.64011: variable 'omit' from source: magic vars 15627 1726882471.64393: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.64405: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882471.64560: variable 'network_provider' from source: set_fact 15627 1726882471.64567: variable 'network_state' from source: role '' defaults 15627 1726882471.64578: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15627 1726882471.64583: variable 'omit' from source: magic vars 15627 1726882471.64621: variable 'omit' from source: magic vars 15627 1726882471.64650: variable 'network_service_name' from source: role '' defaults 15627 1726882471.64749: variable 'network_service_name' from source: role '' defaults 15627 1726882471.65084: variable '__network_provider_setup' from source: role '' defaults 15627 1726882471.65088: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882471.65149: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882471.65160: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882471.65220: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882471.65436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15627 1726882471.68941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882471.69012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882471.69047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882471.69106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882471.69131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882471.69630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.69661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.69689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.69731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.69745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.69792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882471.69814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.69839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.69880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.69895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.70118: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882471.70223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.70246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.70273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.70357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.70401: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.71241: variable 'ansible_python' from source: facts 15627 1726882471.71262: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882471.71339: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882471.71918: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882471.72033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.72059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.72081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.72118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.72132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.72177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882471.72202: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882471.72222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.72260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882471.72480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882471.72606: variable 'network_connections' from source: play vars 15627 1726882471.72613: variable 'interface' from source: set_fact 15627 1726882471.72688: variable 'interface' from source: set_fact 15627 1726882471.72698: variable 'interface' from source: set_fact 15627 1726882471.72766: variable 'interface' from source: set_fact 15627 1726882471.73273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882471.73461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882471.73562: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882471.73702: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882471.73875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882471.74028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882471.74188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882471.74219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882471.74250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882471.74416: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882471.74911: variable 'network_connections' from source: play vars 15627 1726882471.74914: variable 'interface' from source: set_fact 15627 1726882471.75030: variable 'interface' from source: set_fact 15627 1726882471.75158: variable 'interface' from source: set_fact 15627 1726882471.75227: variable 'interface' from source: set_fact 15627 1726882471.75391: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882471.75474: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882471.76441: variable 'network_connections' from source: play vars 15627 1726882471.76450: variable 'interface' from source: set_fact 15627 1726882471.76545: variable 'interface' from source: set_fact 15627 1726882471.76558: variable 'interface' from source: set_fact 15627 1726882471.76660: variable 'interface' from source: set_fact 15627 1726882471.76695: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882471.76799: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882471.77186: variable 
'network_connections' from source: play vars 15627 1726882471.77195: variable 'interface' from source: set_fact 15627 1726882471.77277: variable 'interface' from source: set_fact 15627 1726882471.77440: variable 'interface' from source: set_fact 15627 1726882471.77520: variable 'interface' from source: set_fact 15627 1726882471.77599: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882471.77673: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882471.77688: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882471.77752: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882471.78038: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882471.78986: variable 'network_connections' from source: play vars 15627 1726882471.79033: variable 'interface' from source: set_fact 15627 1726882471.79111: variable 'interface' from source: set_fact 15627 1726882471.79122: variable 'interface' from source: set_fact 15627 1726882471.79189: variable 'interface' from source: set_fact 15627 1726882471.79203: variable 'ansible_distribution' from source: facts 15627 1726882471.79220: variable '__network_rh_distros' from source: role '' defaults 15627 1726882471.79230: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.79265: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882471.79458: variable 'ansible_distribution' from source: facts 15627 1726882471.79470: variable '__network_rh_distros' from source: role '' defaults 15627 1726882471.79481: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.79496: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882471.79692: variable 'ansible_distribution' from source: 
facts 15627 1726882471.79701: variable '__network_rh_distros' from source: role '' defaults 15627 1726882471.79710: variable 'ansible_distribution_major_version' from source: facts 15627 1726882471.79748: variable 'network_provider' from source: set_fact 15627 1726882471.79788: variable 'omit' from source: magic vars 15627 1726882471.79820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882471.79858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882471.79893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882471.79924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882471.79941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882471.79984: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882471.79997: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.80005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.80117: Set connection var ansible_timeout to 10 15627 1726882471.80130: Set connection var ansible_shell_executable to /bin/sh 15627 1726882471.80139: Set connection var ansible_connection to ssh 15627 1726882471.80147: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882471.80158: Set connection var ansible_pipelining to False 15627 1726882471.80167: Set connection var ansible_shell_type to sh 15627 1726882471.80201: variable 'ansible_shell_executable' from source: unknown 15627 1726882471.80213: variable 'ansible_connection' from source: unknown 15627 1726882471.80220: variable 'ansible_module_compression' from source: unknown 15627 1726882471.80226: 
variable 'ansible_shell_type' from source: unknown 15627 1726882471.80232: variable 'ansible_shell_executable' from source: unknown 15627 1726882471.80238: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882471.80249: variable 'ansible_pipelining' from source: unknown 15627 1726882471.80258: variable 'ansible_timeout' from source: unknown 15627 1726882471.80268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882471.80384: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882471.80400: variable 'omit' from source: magic vars 15627 1726882471.80409: starting attempt loop 15627 1726882471.80423: running the handler 15627 1726882471.80508: variable 'ansible_facts' from source: unknown 15627 1726882471.81552: _low_level_execute_command(): starting 15627 1726882471.81570: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882471.82337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882471.82356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882471.82375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882471.82395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882471.82443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882471.82458: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882471.82476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882471.82493: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882471.82505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882471.82517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882471.82539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882471.82558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882471.82578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882471.82591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882471.82601: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882471.82613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882471.82700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882471.82717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882471.82731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882471.82880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882471.84540: stdout chunk (state=3): >>>/root <<< 15627 1726882471.84713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882471.84770: stderr chunk (state=3): >>><<< 15627 1726882471.84773: stdout chunk (state=3): >>><<< 15627 1726882471.84870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882471.84873: _low_level_execute_command(): starting 15627 1726882471.84876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897 `" && echo ansible-tmp-1726882471.8478897-16174-64291049894897="` echo /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897 `" ) && sleep 0' 15627 1726882471.86318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882471.86338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882471.86385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882471.86445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882471.86519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882471.86538: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882471.86560: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882471.86590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882471.86609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882471.86631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882471.86647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882471.86668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882471.86685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882471.86698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882471.86710: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882471.86724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882471.86881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882471.86979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882471.87005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882471.87138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882471.89047: stdout chunk (state=3): >>>ansible-tmp-1726882471.8478897-16174-64291049894897=/root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897 <<< 15627 1726882471.89180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882471.89246: stderr chunk (state=3): >>><<< 15627 1726882471.89266: stdout chunk (state=3): >>><<< 15627 1726882471.89375: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882471.8478897-16174-64291049894897=/root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882471.89381: variable 'ansible_module_compression' from source: unknown 15627 1726882471.89486: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15627 1726882471.89489: ANSIBALLZ: Acquiring lock 15627 1726882471.89491: ANSIBALLZ: Lock acquired: 140251854220672 15627 1726882471.89493: ANSIBALLZ: Creating module 15627 1726882472.30845: ANSIBALLZ: Writing module into payload 15627 1726882472.31144: ANSIBALLZ: Writing module 15627 1726882472.31187: ANSIBALLZ: Renaming module 15627 1726882472.31193: ANSIBALLZ: Done creating module 15627 1726882472.31230: variable 'ansible_facts' from source: unknown 15627 1726882472.31427: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/AnsiballZ_systemd.py 15627 1726882472.31641: Sending initial data 15627 1726882472.31645: Sent initial data (155 bytes) 15627 1726882472.33262: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.33266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882472.33289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.33325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882472.33331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882472.33343: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.33349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.33358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882472.33361: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882472.33384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.33515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882472.33531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882472.33534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882472.33743: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15627 1726882472.35514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882472.35611: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882472.35715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpv6h8jxme /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/AnsiballZ_systemd.py <<< 15627 1726882472.35820: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882472.39016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882472.39236: stderr chunk (state=3): >>><<< 15627 1726882472.39307: stdout chunk (state=3): >>><<< 15627 1726882472.39422: done transferring module to remote 15627 1726882472.39433: _low_level_execute_command(): starting 15627 1726882472.39438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/ /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/AnsiballZ_systemd.py && sleep 0' 15627 1726882472.41166: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882472.41234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882472.41244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882472.41259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.41298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882472.41327: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882472.41342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.41357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882472.41385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882472.41391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882472.41398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.41408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882472.41419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.41427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882472.41433: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882472.41444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.41525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882472.41538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882472.41549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882472.41681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882472.43585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882472.43608: stderr chunk (state=3): >>><<< 15627 1726882472.43611: stdout chunk (state=3): >>><<< 15627 1726882472.43627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882472.43633: _low_level_execute_command(): starting 15627 1726882472.43636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/AnsiballZ_systemd.py && sleep 0' 15627 1726882472.44411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882472.44426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.44441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882472.44462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.44506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882472.44519: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882472.44533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.44551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882472.44570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882472.44583: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882472.44596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.44610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882472.44626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.44637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882472.44646: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882472.44662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.44736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882472.44756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882472.44779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882472.44914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882472.69862: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 15627 
1726882472.69904: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16084992", "MemoryAvailable": "infinity", "CPUUsageNSec": "769252000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 15627 1726882472.69914: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15627 1726882472.71500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882472.71503: stdout chunk (state=3): >>><<< 15627 1726882472.71505: stderr chunk (state=3): >>><<< 15627 1726882472.71776: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", 
"ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16084992", "MemoryAvailable": "infinity", "CPUUsageNSec": "769252000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": 
"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882472.71786: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882472.71789: _low_level_execute_command(): starting 15627 1726882472.71793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882471.8478897-16174-64291049894897/ > 
/dev/null 2>&1 && sleep 0' 15627 1726882472.73016: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882472.73020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882472.73066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.73070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882472.73072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882472.73178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882472.73197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882472.73208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882472.73337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882472.75223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882472.75259: stderr chunk (state=3): >>><<< 15627 1726882472.75262: stdout chunk (state=3): >>><<< 15627 1726882472.75279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882472.75286: handler run complete 15627 1726882472.75375: attempt loop complete, returning result 15627 1726882472.75378: _execute() done 15627 1726882472.75381: dumping result to json 15627 1726882472.75399: done dumping result, returning 15627 1726882472.75406: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-2847-7723-000000000020] 15627 1726882472.75410: sending task result for task 0e448fcc-3ce9-2847-7723-000000000020 15627 1726882472.75925: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000020 15627 1726882472.75928: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882472.75976: no more pending results, returning what we have 15627 1726882472.75979: results queue empty 15627 1726882472.75980: 
checking for any_errors_fatal 15627 1726882472.75985: done checking for any_errors_fatal 15627 1726882472.75985: checking for max_fail_percentage 15627 1726882472.75987: done checking for max_fail_percentage 15627 1726882472.75988: checking to see if all hosts have failed and the running result is not ok 15627 1726882472.75989: done checking to see if all hosts have failed 15627 1726882472.75989: getting the remaining hosts for this loop 15627 1726882472.75991: done getting the remaining hosts for this loop 15627 1726882472.75994: getting the next task for host managed_node1 15627 1726882472.75999: done getting next task for host managed_node1 15627 1726882472.76003: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15627 1726882472.76005: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882472.76015: getting variables 15627 1726882472.76017: in VariableManager get_vars() 15627 1726882472.76046: Calling all_inventory to load vars for managed_node1 15627 1726882472.76049: Calling groups_inventory to load vars for managed_node1 15627 1726882472.76051: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882472.76061: Calling all_plugins_play to load vars for managed_node1 15627 1726882472.76066: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882472.76070: Calling groups_plugins_play to load vars for managed_node1 15627 1726882472.79444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882472.81936: done with get_vars() 15627 1726882472.81970: done getting variables 15627 1726882472.82088: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:32 -0400 (0:00:01.199) 0:00:12.572 ****** 15627 1726882472.82125: entering _queue_task() for managed_node1/service 15627 1726882472.82617: worker is 1 (out of 1 available) 15627 1726882472.82642: exiting _queue_task() for managed_node1/service 15627 1726882472.82657: done queuing things up, now waiting for results queue to drain 15627 1726882472.82658: waiting for pending results... 
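An editor's note on the `_low_level_execute_command()` lines above: after a module finishes, Ansible removes its remote temp directory by handing a single command string to `/bin/sh -c` over the SSH connection, exactly as logged (`rm -f -r <tmpdir> > /dev/null 2>&1 && sleep 0`). A minimal local sketch of that mechanism, with a scratch directory standing in for the real remote `ansible-tmp-*` path (which is on the managed node and must not be touched here):

```python
import os
import subprocess
import tempfile

# A scratch directory stands in for the remote ansible-tmp-* path.
scratch = tempfile.mkdtemp(prefix="ansible-tmp-demo-")

# Same shape as the logged cleanup: one string handed to /bin/sh -c.
result = subprocess.run(
    ["/bin/sh", "-c", f"rm -f -r {scratch} > /dev/null 2>&1 && sleep 0"],
)

print(result.returncode)        # 0, matching "done: rc=0" in the log
print(os.path.exists(scratch))  # False: the tmpdir is gone
```

The `&& sleep 0` tail mirrors Ansible's habit of forcing a final, successful shell step so the reported exit status reflects the whole pipeline.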
15627 1726882472.82952: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
15627 1726882472.83112: in run() - task 0e448fcc-3ce9-2847-7723-000000000021
15627 1726882472.83132: variable 'ansible_search_path' from source: unknown
15627 1726882472.83139: variable 'ansible_search_path' from source: unknown
15627 1726882472.83191: calling self._execute()
15627 1726882472.83297: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882472.83319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882472.83336: variable 'omit' from source: magic vars
15627 1726882472.83821: variable 'ansible_distribution_major_version' from source: facts
15627 1726882472.83839: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882472.84322: variable 'network_provider' from source: set_fact
15627 1726882472.84333: Evaluated conditional (network_provider == "nm"): True
15627 1726882472.84439: variable '__network_wpa_supplicant_required' from source: role '' defaults
15627 1726882472.84546: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15627 1726882472.84759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15627 1726882472.88840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15627 1726882472.88929: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15627 1726882472.88976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15627 1726882472.89026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15627 1726882472.89058: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15627 1726882472.89567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15627 1726882472.89613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15627 1726882472.89793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15627 1726882472.89848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15627 1726882472.89999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15627 1726882472.90071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15627 1726882472.90214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15627 1726882472.90248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15627 1726882472.90334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15627 1726882472.90430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15627 1726882472.90542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15627 1726882472.90583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15627 1726882472.90682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15627 1726882472.90881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15627 1726882472.90908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15627 1726882472.91582: variable 'network_connections' from source: play vars
15627 1726882472.91607: variable 'interface' from source: set_fact
15627 1726882472.91772: variable 'interface' from source: set_fact
15627 1726882472.91784: variable 'interface' from source: set_fact
15627 1726882472.91889: variable 'interface' from source: set_fact
15627 1726882472.91984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15627 1726882472.92215: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15627 1726882472.92258: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15627 1726882472.92309: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15627 1726882472.92333: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15627 1726882472.92366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15627 1726882472.92397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15627 1726882472.92416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15627 1726882472.92435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15627 1726882472.92473: variable '__network_wireless_connections_defined' from source: role '' defaults
15627 1726882472.92643: variable 'network_connections' from source: play vars
15627 1726882472.92646: variable 'interface' from source: set_fact
15627 1726882472.92694: variable 'interface' from source: set_fact
15627 1726882472.92700: variable 'interface' from source: set_fact
15627 1726882472.92744: variable 'interface' from source: set_fact
15627 1726882472.92777: Evaluated conditional (__network_wpa_supplicant_required): False
15627 1726882472.92780: when evaluation is False, skipping this task
15627 1726882472.92783: _execute() done
15627 1726882472.92793: dumping result to json
15627 1726882472.92796: done dumping result, returning
15627 1726882472.92799: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-2847-7723-000000000021]
15627 1726882472.92801: sending task result for task 0e448fcc-3ce9-2847-7723-000000000021
15627 1726882472.92890: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000021
15627 1726882472.92893: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
15627 1726882472.92934: no more pending results, returning what we have
15627 1726882472.92937: results queue empty
15627 1726882472.92938: checking for any_errors_fatal
15627 1726882472.92960: done checking for any_errors_fatal
15627 1726882472.92961: checking for max_fail_percentage
15627 1726882472.92963: done checking for max_fail_percentage
15627 1726882472.92965: checking to see if all hosts have failed and the running result is not ok
15627 1726882472.92966: done checking to see if all hosts have failed
15627 1726882472.92967: getting the remaining hosts for this loop
15627 1726882472.92968: done getting the remaining hosts for this loop
15627 1726882472.92972: getting the next task for host managed_node1
15627 1726882472.92979: done getting next task for host managed_node1
15627 1726882472.92983: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
15627 1726882472.92985: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882472.92997: getting variables
15627 1726882472.92998: in VariableManager get_vars()
15627 1726882472.93057: Calling all_inventory to load vars for managed_node1
15627 1726882472.93060: Calling groups_inventory to load vars for managed_node1
15627 1726882472.93062: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882472.93075: Calling all_plugins_play to load vars for managed_node1
15627 1726882472.93078: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882472.93080: Calling groups_plugins_play to load vars for managed_node1
15627 1726882472.94090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882472.97011: done with get_vars()
15627 1726882472.97047: done getting variables
15627 1726882472.97302: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:34:32 -0400 (0:00:00.152) 0:00:12.725 ******
15627 1726882472.97345: entering _queue_task() for managed_node1/service
15627 1726882472.97929: worker is 1 (out of 1 available)
15627 1726882472.97959: exiting _queue_task() for managed_node1/service
15627 1726882472.97972: done queuing things up, now waiting for results queue to drain
15627 1726882472.97974: waiting for pending results...
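An editor's note on the "Evaluated conditional ... when evaluation is False, skipping this task" sequence above: the wpa_supplicant task was skipped because its `when:` conditions were checked in order and `__network_wpa_supplicant_required` came back False, so the module was never invoked and the failing condition was echoed into the result. A minimal illustrative sketch (not Ansible's actual code; `eval()` stands in for Ansible's Jinja2 templating of `when:` expressions):

```python
# Evaluate when-conditions in order; on the first False, return a "skipping"
# result shaped like the one in the log, without running the module at all.
def run_task(conditions: list[str], variables: dict) -> dict:
    for cond in conditions:
        # eval() is a stand-in for Jinja2 condition templating.
        if not eval(cond, {}, dict(variables)):
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}  # the real executor would invoke the module here

result = run_task(
    ['network_provider == "nm"', "__network_wpa_supplicant_required"],
    {"network_provider": "nm", "__network_wpa_supplicant_required": False},
)
print(result["false_condition"])  # __network_wpa_supplicant_required
```

Note how the first condition passes (the log shows `network_provider == "nm"` evaluated True) and only the second one trips the skip, which is why `false_condition` names it specifically.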
15627 1726882472.98309: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
15627 1726882472.98493: in run() - task 0e448fcc-3ce9-2847-7723-000000000022
15627 1726882472.98525: variable 'ansible_search_path' from source: unknown
15627 1726882472.98534: variable 'ansible_search_path' from source: unknown
15627 1726882472.98578: calling self._execute()
15627 1726882472.98691: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882472.98705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882472.98727: variable 'omit' from source: magic vars
15627 1726882472.99086: variable 'ansible_distribution_major_version' from source: facts
15627 1726882472.99096: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882472.99177: variable 'network_provider' from source: set_fact
15627 1726882472.99181: Evaluated conditional (network_provider == "initscripts"): False
15627 1726882472.99185: when evaluation is False, skipping this task
15627 1726882472.99188: _execute() done
15627 1726882472.99191: dumping result to json
15627 1726882472.99193: done dumping result, returning
15627 1726882472.99199: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-2847-7723-000000000022]
15627 1726882472.99205: sending task result for task 0e448fcc-3ce9-2847-7723-000000000022
15627 1726882472.99288: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000022
15627 1726882472.99290: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15627 1726882472.99340: no more pending results, returning what we have
15627 1726882472.99344: results queue empty
15627 1726882472.99344: checking for any_errors_fatal
15627 1726882472.99357: done checking for any_errors_fatal
15627 1726882472.99358: checking for max_fail_percentage
15627 1726882472.99360: done checking for max_fail_percentage
15627 1726882472.99361: checking to see if all hosts have failed and the running result is not ok
15627 1726882472.99362: done checking to see if all hosts have failed
15627 1726882472.99363: getting the remaining hosts for this loop
15627 1726882472.99366: done getting the remaining hosts for this loop
15627 1726882472.99370: getting the next task for host managed_node1
15627 1726882472.99376: done getting next task for host managed_node1
15627 1726882472.99380: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
15627 1726882472.99382: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882472.99395: getting variables
15627 1726882472.99397: in VariableManager get_vars()
15627 1726882472.99429: Calling all_inventory to load vars for managed_node1
15627 1726882472.99431: Calling groups_inventory to load vars for managed_node1
15627 1726882472.99433: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882472.99441: Calling all_plugins_play to load vars for managed_node1
15627 1726882472.99443: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882472.99446: Calling groups_plugins_play to load vars for managed_node1
15627 1726882473.00347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882473.02762: done with get_vars()
15627 1726882473.02800: done getting variables
15627 1726882473.02880: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:34:33 -0400 (0:00:00.055) 0:00:12.780 ******
15627 1726882473.02910: entering _queue_task() for managed_node1/copy
15627 1726882473.03219: worker is 1 (out of 1 available)
15627 1726882473.03234: exiting _queue_task() for managed_node1/copy
15627 1726882473.03252: done queuing things up, now waiting for results queue to drain
15627 1726882473.03255: waiting for pending results...
15627 1726882473.03669: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
15627 1726882473.03769: in run() - task 0e448fcc-3ce9-2847-7723-000000000023
15627 1726882473.03774: variable 'ansible_search_path' from source: unknown
15627 1726882473.03777: variable 'ansible_search_path' from source: unknown
15627 1726882473.03805: calling self._execute()
15627 1726882473.03901: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882473.03905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882473.03916: variable 'omit' from source: magic vars
15627 1726882473.04425: variable 'ansible_distribution_major_version' from source: facts
15627 1726882473.04429: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882473.04536: variable 'network_provider' from source: set_fact
15627 1726882473.04541: Evaluated conditional (network_provider == "initscripts"): False
15627 1726882473.04545: when evaluation is False, skipping this task
15627 1726882473.04548: _execute() done
15627 1726882473.04551: dumping result to json
15627 1726882473.04553: done dumping result, returning
15627 1726882473.04567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-2847-7723-000000000023]
15627 1726882473.04582: sending task result for task 0e448fcc-3ce9-2847-7723-000000000023
15627 1726882473.04676: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000023
15627 1726882473.04679: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
15627 1726882473.04754: no more pending results, returning what we have
15627 1726882473.04758: results queue empty
15627 1726882473.04759: checking for any_errors_fatal
15627 1726882473.04767: done checking for any_errors_fatal
15627 1726882473.04767: checking for max_fail_percentage
15627 1726882473.04769: done checking for max_fail_percentage
15627 1726882473.04770: checking to see if all hosts have failed and the running result is not ok
15627 1726882473.04771: done checking to see if all hosts have failed
15627 1726882473.04772: getting the remaining hosts for this loop
15627 1726882473.04773: done getting the remaining hosts for this loop
15627 1726882473.04776: getting the next task for host managed_node1
15627 1726882473.04784: done getting next task for host managed_node1
15627 1726882473.04787: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
15627 1726882473.04789: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882473.04800: getting variables
15627 1726882473.04802: in VariableManager get_vars()
15627 1726882473.04839: Calling all_inventory to load vars for managed_node1
15627 1726882473.04842: Calling groups_inventory to load vars for managed_node1
15627 1726882473.04845: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882473.04853: Calling all_plugins_play to load vars for managed_node1
15627 1726882473.04855: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882473.04858: Calling groups_plugins_play to load vars for managed_node1
15627 1726882473.06216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882473.08616: done with get_vars()
15627 1726882473.08637: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:34:33 -0400 (0:00:00.058) 0:00:12.838 ******
15627 1726882473.08720: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
15627 1726882473.08722: Creating lock for fedora.linux_system_roles.network_connections
15627 1726882473.08966: worker is 1 (out of 1 available)
15627 1726882473.08981: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
15627 1726882473.08992: done queuing things up, now waiting for results queue to drain
15627 1726882473.08993: waiting for pending results...
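An editor's note on the recurring `^ state is: HOST STATE: ...` lines above: the strategy plugin logs its per-host position as a flat run of `key=value` fields (`block`, `task`, `run_state`, and so on) that advance as each task completes (`task=17`, then `18`, `19`, `20` across this excerpt). A small illustrative parser for the key=value portion of such a line (the trailing `... child state? (None)` fields are free-form and ignored here):

```python
import re

def parse_host_state(line: str) -> dict:
    """Pull the key=value fields out of an ansible -vvvv 'HOST STATE:' line,
    converting integers and booleans; free-form fields are skipped."""
    fields = {}
    for key, value in re.findall(r"(\w+)=(\w+)", line):
        if value.isdigit():
            fields[key] = int(value)
        elif value in ("True", "False"):
            fields[key] = value == "True"
        else:
            fields[key] = value
    return fields

state = parse_host_state(
    "HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, "
    "run_state=1, fail_state=0, pre_flushing_run_state=1, "
    "update_handlers=True, pending_setup=False"
)
print(state["task"])  # 20
```

This makes the progression visible at a glance: within block 2 the role advances one task at a time while `run_state=1` (executing tasks) and `fail_state=0` (no failures) hold steady.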
15627 1726882473.09162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
15627 1726882473.09232: in run() - task 0e448fcc-3ce9-2847-7723-000000000024
15627 1726882473.09244: variable 'ansible_search_path' from source: unknown
15627 1726882473.09247: variable 'ansible_search_path' from source: unknown
15627 1726882473.09278: calling self._execute()
15627 1726882473.09351: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882473.09359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882473.09368: variable 'omit' from source: magic vars
15627 1726882473.09635: variable 'ansible_distribution_major_version' from source: facts
15627 1726882473.09646: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882473.09649: variable 'omit' from source: magic vars
15627 1726882473.09680: variable 'omit' from source: magic vars
15627 1726882473.09792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15627 1726882473.11761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15627 1726882473.11828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15627 1726882473.11868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15627 1726882473.11903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15627 1726882473.11931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15627 1726882473.12013: variable 'network_provider' from source: set_fact
15627 1726882473.12141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15627 1726882473.12186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15627 1726882473.12211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15627 1726882473.12250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15627 1726882473.12271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15627 1726882473.12339: variable 'omit' from source: magic vars
15627 1726882473.12450: variable 'omit' from source: magic vars
15627 1726882473.12552: variable 'network_connections' from source: play vars
15627 1726882473.12567: variable 'interface' from source: set_fact
15627 1726882473.12632: variable 'interface' from source: set_fact
15627 1726882473.12639: variable 'interface' from source: set_fact
15627 1726882473.12702: variable 'interface' from source: set_fact
15627 1726882473.12858: variable 'omit' from source: magic vars
15627 1726882473.12866: variable '__lsr_ansible_managed' from source: task vars
15627 1726882473.12924: variable '__lsr_ansible_managed' from source: task vars
15627 1726882473.13094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
15627 1726882473.13296: Loaded config def from plugin (lookup/template)
15627 1726882473.13299: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
15627 1726882473.13327: File lookup term: get_ansible_managed.j2
15627 1726882473.13330: variable 'ansible_search_path' from source: unknown
15627 1726882473.13333: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
15627 1726882473.13349: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
15627 1726882473.13363: variable 'ansible_search_path' from source: unknown
15627 1726882473.19250: variable 'ansible_managed' from source: unknown
15627 1726882473.19337: variable 'omit' from source: magic vars
15627 1726882473.19359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15627 1726882473.19379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15627 1726882473.19393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15627 1726882473.19410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882473.19417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882473.19440: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15627 1726882473.19443: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882473.19446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882473.19511: Set connection var ansible_timeout to 10
15627 1726882473.19521: Set connection var ansible_shell_executable to /bin/sh
15627 1726882473.19524: Set connection var ansible_connection to ssh
15627 1726882473.19527: Set connection var ansible_module_compression to ZIP_DEFLATED
15627 1726882473.19533: Set connection var ansible_pipelining to False
15627 1726882473.19535: Set connection var ansible_shell_type to sh
15627 1726882473.19553: variable 'ansible_shell_executable' from source: unknown
15627 1726882473.19558: variable 'ansible_connection' from source: unknown
15627 1726882473.19560: variable 'ansible_module_compression' from source: unknown
15627 1726882473.19563: variable 'ansible_shell_type' from source: unknown
15627 1726882473.19567: variable 'ansible_shell_executable' from source: unknown
15627 1726882473.19569: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882473.19572: variable 'ansible_pipelining' from source: unknown
15627 1726882473.19574: variable 'ansible_timeout' from source: unknown
15627 1726882473.19576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882473.19683: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
15627 1726882473.19694: variable 'omit' from source: magic vars
15627 1726882473.19697: starting attempt loop
15627 1726882473.19699: running the handler
15627 1726882473.19710: _low_level_execute_command(): starting
15627 1726882473.19716: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15627 1726882473.20203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15627 1726882473.20218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882473.20231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882473.20242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<<
15627 1726882473.20251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882473.20303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15627 1726882473.20311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15627 1726882473.20317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15627 1726882473.20432: stderr chunk (state=3): >>>debug1:
mux_client_request_session: master session id: 2 <<< 15627 1726882473.22109: stdout chunk (state=3): >>>/root <<< 15627 1726882473.22237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882473.22276: stderr chunk (state=3): >>><<< 15627 1726882473.22279: stdout chunk (state=3): >>><<< 15627 1726882473.22317: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882473.22321: _low_level_execute_command(): starting 15627 1726882473.22331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323 `" && echo ansible-tmp-1726882473.2230964-16241-213776482596323="` echo /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323 `" ) && sleep 0' 15627 
1726882473.22760: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.22767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.22795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882473.22802: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882473.22816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.22825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882473.22834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882473.22839: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.22851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.22859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882473.22870: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882473.22875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.22927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882473.22948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882473.22956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882473.23053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882473.25171: stdout chunk 
(state=3): >>>ansible-tmp-1726882473.2230964-16241-213776482596323=/root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323 <<< 15627 1726882473.25174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882473.25381: stderr chunk (state=3): >>><<< 15627 1726882473.25385: stdout chunk (state=3): >>><<< 15627 1726882473.25388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882473.2230964-16241-213776482596323=/root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882473.25390: variable 'ansible_module_compression' from source: unknown 15627 1726882473.25392: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15627 1726882473.25394: ANSIBALLZ: Acquiring lock 15627 1726882473.25396: ANSIBALLZ: Lock acquired: 140251851990624 15627 
1726882473.25398: ANSIBALLZ: Creating module 15627 1726882473.55717: ANSIBALLZ: Writing module into payload 15627 1726882473.56630: ANSIBALLZ: Writing module 15627 1726882473.56780: ANSIBALLZ: Renaming module 15627 1726882473.56787: ANSIBALLZ: Done creating module 15627 1726882473.56811: variable 'ansible_facts' from source: unknown 15627 1726882473.57033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/AnsiballZ_network_connections.py 15627 1726882473.57919: Sending initial data 15627 1726882473.57922: Sent initial data (168 bytes) 15627 1726882473.59003: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882473.59014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882473.59028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.59042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.59090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882473.59099: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882473.59111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.59124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882473.59133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882473.59140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882473.59149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882473.59167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.59182: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.59193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882473.59200: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882473.59210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.59295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882473.59312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882473.59318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882473.59452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882473.61283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882473.61291: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882473.61379: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882473.61479: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpttpbk4ll /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/AnsiballZ_network_connections.py <<< 15627 1726882473.61573: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 
1726882473.63691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882473.63694: stderr chunk (state=3): >>><<< 15627 1726882473.63699: stdout chunk (state=3): >>><<< 15627 1726882473.63701: done transferring module to remote 15627 1726882473.63704: _low_level_execute_command(): starting 15627 1726882473.63706: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/ /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/AnsiballZ_network_connections.py && sleep 0' 15627 1726882473.64402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882473.64417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882473.64454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.64460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.64495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882473.64498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.64589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882473.64593: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 15627 1726882473.64605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882473.64735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882473.66526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882473.66568: stderr chunk (state=3): >>><<< 15627 1726882473.66571: stdout chunk (state=3): >>><<< 15627 1726882473.66583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882473.66586: _low_level_execute_command(): starting 15627 1726882473.66588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/AnsiballZ_network_connections.py && sleep 0' 15627 1726882473.67006: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.67009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.67043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.67049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.67052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.67102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882473.67105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882473.67207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882473.94534: stdout chunk (state=3): >>> <<< 15627 1726882473.94538: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", 
"interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15627 1726882473.96414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882473.96478: stderr chunk (state=3): >>><<< 15627 1726882473.96481: stdout chunk (state=3): >>><<< 15627 1726882473.96497: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882473.96528: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882473.96536: _low_level_execute_command(): starting 15627 1726882473.96540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882473.2230964-16241-213776482596323/ > /dev/null 2>&1 && sleep 0' 15627 1726882473.97021: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.97027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882473.97059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.97068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration <<< 15627 1726882473.97076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882473.97081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882473.97093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882473.97102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882473.97157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882473.97161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882473.97175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882473.97278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882473.99133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882473.99184: stderr chunk (state=3): >>><<< 15627 1726882473.99187: stdout chunk (state=3): >>><<< 15627 1726882473.99199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882473.99207: handler run complete 15627 1726882473.99227: attempt loop complete, returning result 15627 1726882473.99230: _execute() done 15627 1726882473.99232: dumping result to json 15627 1726882473.99237: done dumping result, returning 15627 1726882473.99248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-2847-7723-000000000024] 15627 1726882473.99250: sending task result for task 0e448fcc-3ce9-2847-7723-000000000024 15627 1726882473.99359: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000024 15627 1726882473.99363: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified) 15627 1726882473.99486: no more pending results, returning what we have 15627 1726882473.99489: results queue empty 15627 1726882473.99490: checking for any_errors_fatal 15627 
1726882473.99497: done checking for any_errors_fatal 15627 1726882473.99498: checking for max_fail_percentage 15627 1726882473.99500: done checking for max_fail_percentage 15627 1726882473.99501: checking to see if all hosts have failed and the running result is not ok 15627 1726882473.99502: done checking to see if all hosts have failed 15627 1726882473.99502: getting the remaining hosts for this loop 15627 1726882473.99504: done getting the remaining hosts for this loop 15627 1726882473.99508: getting the next task for host managed_node1 15627 1726882473.99514: done getting next task for host managed_node1 15627 1726882473.99518: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882473.99519: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882473.99528: getting variables 15627 1726882473.99529: in VariableManager get_vars() 15627 1726882473.99563: Calling all_inventory to load vars for managed_node1 15627 1726882473.99574: Calling groups_inventory to load vars for managed_node1 15627 1726882473.99577: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882473.99586: Calling all_plugins_play to load vars for managed_node1 15627 1726882473.99588: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882473.99591: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.00422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.01452: done with get_vars() 15627 1726882474.01471: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:34 -0400 (0:00:00.928) 0:00:13.766 ****** 15627 1726882474.01531: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882474.01533: Creating lock for fedora.linux_system_roles.network_state 15627 1726882474.01746: worker is 1 (out of 1 available) 15627 1726882474.01761: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882474.01773: done queuing things up, now waiting for results queue to drain 15627 1726882474.01775: waiting for pending results... 
15627 1726882474.01944: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882474.02016: in run() - task 0e448fcc-3ce9-2847-7723-000000000025 15627 1726882474.02027: variable 'ansible_search_path' from source: unknown 15627 1726882474.02031: variable 'ansible_search_path' from source: unknown 15627 1726882474.02062: calling self._execute() 15627 1726882474.02130: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.02134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.02143: variable 'omit' from source: magic vars 15627 1726882474.02419: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.02429: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.02514: variable 'network_state' from source: role '' defaults 15627 1726882474.02522: Evaluated conditional (network_state != {}): False 15627 1726882474.02525: when evaluation is False, skipping this task 15627 1726882474.02528: _execute() done 15627 1726882474.02530: dumping result to json 15627 1726882474.02534: done dumping result, returning 15627 1726882474.02542: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-2847-7723-000000000025] 15627 1726882474.02546: sending task result for task 0e448fcc-3ce9-2847-7723-000000000025 15627 1726882474.02632: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000025 15627 1726882474.02634: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882474.02703: no more pending results, returning what we have 15627 1726882474.02706: results queue empty 15627 1726882474.02707: checking for any_errors_fatal 15627 1726882474.02716: done checking for any_errors_fatal 
15627 1726882474.02717: checking for max_fail_percentage 15627 1726882474.02718: done checking for max_fail_percentage 15627 1726882474.02719: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.02720: done checking to see if all hosts have failed 15627 1726882474.02721: getting the remaining hosts for this loop 15627 1726882474.02722: done getting the remaining hosts for this loop 15627 1726882474.02725: getting the next task for host managed_node1 15627 1726882474.02731: done getting next task for host managed_node1 15627 1726882474.02734: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882474.02736: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882474.02756: getting variables 15627 1726882474.02757: in VariableManager get_vars() 15627 1726882474.02785: Calling all_inventory to load vars for managed_node1 15627 1726882474.02787: Calling groups_inventory to load vars for managed_node1 15627 1726882474.02788: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.02795: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.02797: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.02798: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.03579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.04521: done with get_vars() 15627 1726882474.04535: done getting variables 15627 1726882474.04578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:34 -0400 (0:00:00.030) 0:00:13.797 ****** 15627 1726882474.04602: entering _queue_task() for managed_node1/debug 15627 1726882474.04790: worker is 1 (out of 1 available) 15627 1726882474.04804: exiting _queue_task() for managed_node1/debug 15627 1726882474.04815: done queuing things up, now waiting for results queue to drain 15627 1726882474.04817: waiting for pending results... 
15627 1726882474.04985: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882474.05051: in run() - task 0e448fcc-3ce9-2847-7723-000000000026 15627 1726882474.05067: variable 'ansible_search_path' from source: unknown 15627 1726882474.05070: variable 'ansible_search_path' from source: unknown 15627 1726882474.05096: calling self._execute() 15627 1726882474.05168: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.05171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.05180: variable 'omit' from source: magic vars 15627 1726882474.05436: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.05446: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.05451: variable 'omit' from source: magic vars 15627 1726882474.05485: variable 'omit' from source: magic vars 15627 1726882474.05508: variable 'omit' from source: magic vars 15627 1726882474.05537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882474.05569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882474.05585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882474.05598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.05608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.05630: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882474.05633: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.05635: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15627 1726882474.05710: Set connection var ansible_timeout to 10 15627 1726882474.05716: Set connection var ansible_shell_executable to /bin/sh 15627 1726882474.05721: Set connection var ansible_connection to ssh 15627 1726882474.05726: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882474.05731: Set connection var ansible_pipelining to False 15627 1726882474.05733: Set connection var ansible_shell_type to sh 15627 1726882474.05750: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.05753: variable 'ansible_connection' from source: unknown 15627 1726882474.05756: variable 'ansible_module_compression' from source: unknown 15627 1726882474.05761: variable 'ansible_shell_type' from source: unknown 15627 1726882474.05764: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.05767: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.05771: variable 'ansible_pipelining' from source: unknown 15627 1726882474.05773: variable 'ansible_timeout' from source: unknown 15627 1726882474.05777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.05880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882474.05889: variable 'omit' from source: magic vars 15627 1726882474.05893: starting attempt loop 15627 1726882474.05896: running the handler 15627 1726882474.05994: variable '__network_connections_result' from source: set_fact 15627 1726882474.06039: handler run complete 15627 1726882474.06052: attempt loop complete, returning result 15627 1726882474.06055: _execute() done 15627 1726882474.06059: dumping result to json 15627 1726882474.06062: 
done dumping result, returning 15627 1726882474.06073: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-2847-7723-000000000026] 15627 1726882474.06078: sending task result for task 0e448fcc-3ce9-2847-7723-000000000026 15627 1726882474.06163: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000026 15627 1726882474.06167: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified)" ] } 15627 1726882474.06229: no more pending results, returning what we have 15627 1726882474.06233: results queue empty 15627 1726882474.06238: checking for any_errors_fatal 15627 1726882474.06242: done checking for any_errors_fatal 15627 1726882474.06243: checking for max_fail_percentage 15627 1726882474.06244: done checking for max_fail_percentage 15627 1726882474.06245: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.06246: done checking to see if all hosts have failed 15627 1726882474.06247: getting the remaining hosts for this loop 15627 1726882474.06248: done getting the remaining hosts for this loop 15627 1726882474.06252: getting the next task for host managed_node1 15627 1726882474.06258: done getting next task for host managed_node1 15627 1726882474.06262: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882474.06265: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15627 1726882474.06273: getting variables 15627 1726882474.06275: in VariableManager get_vars() 15627 1726882474.06306: Calling all_inventory to load vars for managed_node1 15627 1726882474.06308: Calling groups_inventory to load vars for managed_node1 15627 1726882474.06310: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.06318: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.06321: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.06323: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.07218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.08134: done with get_vars() 15627 1726882474.08148: done getting variables 15627 1726882474.08195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:34 -0400 (0:00:00.036) 0:00:13.833 ****** 15627 1726882474.08215: entering _queue_task() for managed_node1/debug 15627 1726882474.08409: worker is 1 (out of 1 available) 15627 1726882474.08423: exiting _queue_task() for managed_node1/debug 15627 1726882474.08434: done queuing things up, now waiting for results queue to drain 15627 1726882474.08435: waiting for pending results... 
15627 1726882474.08608: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882474.08684: in run() - task 0e448fcc-3ce9-2847-7723-000000000027 15627 1726882474.08696: variable 'ansible_search_path' from source: unknown 15627 1726882474.08699: variable 'ansible_search_path' from source: unknown 15627 1726882474.08730: calling self._execute() 15627 1726882474.08798: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.08802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.08810: variable 'omit' from source: magic vars 15627 1726882474.09075: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.09084: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.09092: variable 'omit' from source: magic vars 15627 1726882474.09118: variable 'omit' from source: magic vars 15627 1726882474.09141: variable 'omit' from source: magic vars 15627 1726882474.09176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882474.09202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882474.09217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882474.09231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.09241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.09269: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882474.09272: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.09282: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15627 1726882474.09340: Set connection var ansible_timeout to 10 15627 1726882474.09347: Set connection var ansible_shell_executable to /bin/sh 15627 1726882474.09351: Set connection var ansible_connection to ssh 15627 1726882474.09359: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882474.09365: Set connection var ansible_pipelining to False 15627 1726882474.09368: Set connection var ansible_shell_type to sh 15627 1726882474.09388: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.09391: variable 'ansible_connection' from source: unknown 15627 1726882474.09394: variable 'ansible_module_compression' from source: unknown 15627 1726882474.09396: variable 'ansible_shell_type' from source: unknown 15627 1726882474.09398: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.09401: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.09403: variable 'ansible_pipelining' from source: unknown 15627 1726882474.09405: variable 'ansible_timeout' from source: unknown 15627 1726882474.09407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.09506: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882474.09514: variable 'omit' from source: magic vars 15627 1726882474.09519: starting attempt loop 15627 1726882474.09521: running the handler 15627 1726882474.09562: variable '__network_connections_result' from source: set_fact 15627 1726882474.09614: variable '__network_connections_result' from source: set_fact 15627 1726882474.09697: handler run complete 15627 1726882474.09716: attempt loop complete, returning result 15627 1726882474.09719: 
_execute() done 15627 1726882474.09722: dumping result to json 15627 1726882474.09724: done dumping result, returning 15627 1726882474.09732: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-2847-7723-000000000027] 15627 1726882474.09736: sending task result for task 0e448fcc-3ce9-2847-7723-000000000027 15627 1726882474.09829: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000027 15627 1726882474.09832: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 8673f01a-a0f2-4871-9987-eca35b758d19 (is-modified)" ] } } 15627 1726882474.09932: no more pending results, returning what we have 15627 1726882474.09935: results queue empty 15627 1726882474.09936: checking for any_errors_fatal 15627 1726882474.09940: done checking for any_errors_fatal 15627 1726882474.09940: checking for max_fail_percentage 15627 1726882474.09942: done checking for max_fail_percentage 15627 1726882474.09942: checking to see if all hosts have failed and the running 
result is not ok 15627 1726882474.09943: done checking to see if all hosts have failed 15627 1726882474.09944: getting the remaining hosts for this loop 15627 1726882474.09945: done getting the remaining hosts for this loop 15627 1726882474.09948: getting the next task for host managed_node1 15627 1726882474.09953: done getting next task for host managed_node1 15627 1726882474.09961: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882474.09962: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882474.09970: getting variables 15627 1726882474.09971: in VariableManager get_vars() 15627 1726882474.09993: Calling all_inventory to load vars for managed_node1 15627 1726882474.09995: Calling groups_inventory to load vars for managed_node1 15627 1726882474.09996: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.10002: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.10004: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.10005: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.10777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.11711: done with get_vars() 15627 1726882474.11727: done getting variables 15627 1726882474.11769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:34 -0400 (0:00:00.035) 0:00:13.869 ****** 15627 1726882474.11788: entering _queue_task() for managed_node1/debug 15627 1726882474.11978: worker is 1 (out of 1 available) 15627 1726882474.11992: exiting _queue_task() for managed_node1/debug 15627 1726882474.12005: done queuing things up, now waiting for results queue to drain 15627 1726882474.12007: waiting for pending results... 15627 1726882474.12169: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882474.12239: in run() - task 0e448fcc-3ce9-2847-7723-000000000028 15627 1726882474.12250: variable 'ansible_search_path' from source: unknown 15627 1726882474.12254: variable 'ansible_search_path' from source: unknown 15627 1726882474.12289: calling self._execute() 15627 1726882474.12359: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.12366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.12372: variable 'omit' from source: magic vars 15627 1726882474.12636: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.12646: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.12743: variable 'network_state' from source: role '' defaults 15627 1726882474.12749: Evaluated conditional (network_state != {}): False 15627 1726882474.12752: when evaluation is False, skipping this task 15627 1726882474.12757: _execute() done 15627 1726882474.12759: dumping result to json 15627 1726882474.12761: done dumping result, returning 15627 1726882474.12770: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-2847-7723-000000000028] 15627 
1726882474.12776: sending task result for task 0e448fcc-3ce9-2847-7723-000000000028 15627 1726882474.12852: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000028 15627 1726882474.12858: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15627 1726882474.12923: no more pending results, returning what we have 15627 1726882474.12926: results queue empty 15627 1726882474.12927: checking for any_errors_fatal 15627 1726882474.12934: done checking for any_errors_fatal 15627 1726882474.12935: checking for max_fail_percentage 15627 1726882474.12936: done checking for max_fail_percentage 15627 1726882474.12937: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.12938: done checking to see if all hosts have failed 15627 1726882474.12939: getting the remaining hosts for this loop 15627 1726882474.12940: done getting the remaining hosts for this loop 15627 1726882474.12943: getting the next task for host managed_node1 15627 1726882474.12948: done getting next task for host managed_node1 15627 1726882474.12951: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882474.12953: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882474.12970: getting variables 15627 1726882474.12971: in VariableManager get_vars() 15627 1726882474.12995: Calling all_inventory to load vars for managed_node1 15627 1726882474.12997: Calling groups_inventory to load vars for managed_node1 15627 1726882474.12999: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.13005: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.13007: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.13008: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.13833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.14765: done with get_vars() 15627 1726882474.14779: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:34 -0400 (0:00:00.030) 0:00:13.900 ****** 15627 1726882474.14846: entering _queue_task() for managed_node1/ping 15627 1726882474.14848: Creating lock for ping 15627 1726882474.15047: worker is 1 (out of 1 available) 15627 1726882474.15060: exiting _queue_task() for managed_node1/ping 15627 1726882474.15073: done queuing things up, now waiting for results queue to drain 15627 1726882474.15074: waiting for pending results... 
15627 1726882474.15252: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882474.15325: in run() - task 0e448fcc-3ce9-2847-7723-000000000029 15627 1726882474.15336: variable 'ansible_search_path' from source: unknown 15627 1726882474.15339: variable 'ansible_search_path' from source: unknown 15627 1726882474.15371: calling self._execute() 15627 1726882474.15438: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.15442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.15451: variable 'omit' from source: magic vars 15627 1726882474.15726: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.15736: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.15741: variable 'omit' from source: magic vars 15627 1726882474.15772: variable 'omit' from source: magic vars 15627 1726882474.15794: variable 'omit' from source: magic vars 15627 1726882474.15827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882474.15854: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882474.15874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882474.15887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.15897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.15922: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882474.15925: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.15928: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15627 1726882474.15997: Set connection var ansible_timeout to 10 15627 1726882474.16003: Set connection var ansible_shell_executable to /bin/sh 15627 1726882474.16008: Set connection var ansible_connection to ssh 15627 1726882474.16013: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882474.16022: Set connection var ansible_pipelining to False 15627 1726882474.16025: Set connection var ansible_shell_type to sh 15627 1726882474.16044: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.16047: variable 'ansible_connection' from source: unknown 15627 1726882474.16050: variable 'ansible_module_compression' from source: unknown 15627 1726882474.16054: variable 'ansible_shell_type' from source: unknown 15627 1726882474.16056: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.16059: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.16061: variable 'ansible_pipelining' from source: unknown 15627 1726882474.16064: variable 'ansible_timeout' from source: unknown 15627 1726882474.16067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.16212: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882474.16221: variable 'omit' from source: magic vars 15627 1726882474.16224: starting attempt loop 15627 1726882474.16227: running the handler 15627 1726882474.16239: _low_level_execute_command(): starting 15627 1726882474.16246: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882474.16774: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882474.16784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.16816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.16826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.16839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.16846: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.16897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.16910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.16921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.17031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.18701: stdout chunk (state=3): >>>/root <<< 15627 1726882474.18804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.18855: stderr chunk (state=3): >>><<< 15627 1726882474.18866: stdout chunk (state=3): >>><<< 15627 1726882474.18889: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.18899: _low_level_execute_command(): starting 15627 1726882474.18905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510 `" && echo ansible-tmp-1726882474.1888735-16276-178839133512510="` echo /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510 `" ) && sleep 0' 15627 1726882474.19375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.19386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.19413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.19434: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.19491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.19502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.19606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.21480: stdout chunk (state=3): >>>ansible-tmp-1726882474.1888735-16276-178839133512510=/root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510 <<< 15627 1726882474.21593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.21644: stderr chunk (state=3): >>><<< 15627 1726882474.21647: stdout chunk (state=3): >>><<< 15627 1726882474.21665: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882474.1888735-16276-178839133512510=/root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.21704: variable 'ansible_module_compression' from source: unknown 15627 1726882474.21738: ANSIBALLZ: Using lock for ping 15627 1726882474.21744: ANSIBALLZ: Acquiring lock 15627 1726882474.21747: ANSIBALLZ: Lock acquired: 140251851986448 15627 1726882474.21750: ANSIBALLZ: Creating module 15627 1726882474.29788: ANSIBALLZ: Writing module into payload 15627 1726882474.29831: ANSIBALLZ: Writing module 15627 1726882474.29847: ANSIBALLZ: Renaming module 15627 1726882474.29853: ANSIBALLZ: Done creating module 15627 1726882474.29873: variable 'ansible_facts' from source: unknown 15627 1726882474.29918: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/AnsiballZ_ping.py 15627 1726882474.30032: Sending initial data 15627 1726882474.30035: Sent initial data (153 bytes) 15627 1726882474.30940: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.30944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.30989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.30992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.30995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.31055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.31060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.31062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.31223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.33024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882474.33118: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882474.33213: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp7gg3atar 
/root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/AnsiballZ_ping.py <<< 15627 1726882474.33303: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882474.34382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.34457: stderr chunk (state=3): >>><<< 15627 1726882474.34463: stdout chunk (state=3): >>><<< 15627 1726882474.34481: done transferring module to remote 15627 1726882474.34492: _low_level_execute_command(): starting 15627 1726882474.34495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/ /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/AnsiballZ_ping.py && sleep 0' 15627 1726882474.34936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.34942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.34956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.35001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.35004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.35007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.35009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.35060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.35067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.35174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.36944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.37004: stderr chunk (state=3): >>><<< 15627 1726882474.37007: stdout chunk (state=3): >>><<< 15627 1726882474.37044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.37047: _low_level_execute_command(): starting 15627 1726882474.37049: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/AnsiballZ_ping.py && sleep 0' 15627 1726882474.37726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882474.37744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.37761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.37783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.37832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.37846: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882474.37867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.37885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882474.37897: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882474.37910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882474.37924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.37937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.37955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.37974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.37986: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882474.38001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.38085: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 15627 1726882474.38107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.38124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.38257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.51208: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15627 1726882474.52204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882474.52208: stdout chunk (state=3): >>><<< 15627 1726882474.52211: stderr chunk (state=3): >>><<< 15627 1726882474.52338: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.44.90 closed. 15627 1726882474.52343: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882474.52346: _low_level_execute_command(): starting 15627 1726882474.52348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882474.1888735-16276-178839133512510/ > /dev/null 2>&1 && sleep 0' 15627 1726882474.52987: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882474.53004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.53030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.53049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.53095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.53106: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882474.53127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.53150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882474.53166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address <<< 15627 1726882474.53178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882474.53189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.53206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.53222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.53247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.53260: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882474.53276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.53361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.53387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.53404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.53527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.55350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.55422: stderr chunk (state=3): >>><<< 15627 1726882474.55433: stdout chunk (state=3): >>><<< 15627 1726882474.55469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.55576: handler run complete 15627 1726882474.55580: attempt loop complete, returning result 15627 1726882474.55582: _execute() done 15627 1726882474.55584: dumping result to json 15627 1726882474.55586: done dumping result, returning 15627 1726882474.55588: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-2847-7723-000000000029] 15627 1726882474.55590: sending task result for task 0e448fcc-3ce9-2847-7723-000000000029 15627 1726882474.55655: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000029 15627 1726882474.55658: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15627 1726882474.55924: no more pending results, returning what we have 15627 1726882474.55927: results queue empty 15627 1726882474.55928: checking for any_errors_fatal 15627 1726882474.55933: done checking for any_errors_fatal 15627 1726882474.55934: checking for max_fail_percentage 15627 1726882474.55935: done checking for max_fail_percentage 15627 1726882474.55936: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.55937: done checking to see if all hosts have failed 15627 1726882474.55938: getting the remaining hosts for this loop 15627 1726882474.55939: done getting the remaining hosts for this loop 15627 
1726882474.55943: getting the next task for host managed_node1 15627 1726882474.55951: done getting next task for host managed_node1 15627 1726882474.55953: ^ task is: TASK: meta (role_complete) 15627 1726882474.55955: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882474.55967: getting variables 15627 1726882474.55969: in VariableManager get_vars() 15627 1726882474.56009: Calling all_inventory to load vars for managed_node1 15627 1726882474.56012: Calling groups_inventory to load vars for managed_node1 15627 1726882474.56015: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.56025: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.56028: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.56031: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.57663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.61219: done with get_vars() 15627 1726882474.61244: done getting variables 15627 1726882474.61447: done queuing things up, now waiting for results queue to drain 15627 1726882474.61449: results queue empty 15627 1726882474.61450: checking for any_errors_fatal 15627 1726882474.61453: done checking for any_errors_fatal 15627 1726882474.61454: checking for max_fail_percentage 15627 1726882474.61455: done checking for max_fail_percentage 15627 1726882474.61455: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.61456: done checking to see if all hosts have failed 15627 1726882474.61457: getting the remaining hosts for this loop 15627 1726882474.61458: done getting the remaining hosts for this loop 
15627 1726882474.61460: getting the next task for host managed_node1 15627 1726882474.61466: done getting next task for host managed_node1 15627 1726882474.61468: ^ task is: TASK: meta (flush_handlers) 15627 1726882474.61469: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882474.61472: getting variables 15627 1726882474.61473: in VariableManager get_vars() 15627 1726882474.61579: Calling all_inventory to load vars for managed_node1 15627 1726882474.61588: Calling groups_inventory to load vars for managed_node1 15627 1726882474.61590: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.61601: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.61603: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.61606: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.63776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.65736: done with get_vars() 15627 1726882474.65768: done getting variables 15627 1726882474.65819: in VariableManager get_vars() 15627 1726882474.65831: Calling all_inventory to load vars for managed_node1 15627 1726882474.65834: Calling groups_inventory to load vars for managed_node1 15627 1726882474.65836: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.65841: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.65843: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.65846: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.67296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 15627 1726882474.69101: done with get_vars() 15627 1726882474.69137: done queuing things up, now waiting for results queue to drain 15627 1726882474.69139: results queue empty 15627 1726882474.69140: checking for any_errors_fatal 15627 1726882474.69141: done checking for any_errors_fatal 15627 1726882474.69146: checking for max_fail_percentage 15627 1726882474.69147: done checking for max_fail_percentage 15627 1726882474.69148: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.69149: done checking to see if all hosts have failed 15627 1726882474.69150: getting the remaining hosts for this loop 15627 1726882474.69151: done getting the remaining hosts for this loop 15627 1726882474.69153: getting the next task for host managed_node1 15627 1726882474.69157: done getting next task for host managed_node1 15627 1726882474.69159: ^ task is: TASK: meta (flush_handlers) 15627 1726882474.69160: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882474.69163: getting variables 15627 1726882474.69166: in VariableManager get_vars() 15627 1726882474.69177: Calling all_inventory to load vars for managed_node1 15627 1726882474.69179: Calling groups_inventory to load vars for managed_node1 15627 1726882474.69181: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.69186: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.69188: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.69190: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.70469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.72233: done with get_vars() 15627 1726882474.72258: done getting variables 15627 1726882474.72313: in VariableManager get_vars() 15627 1726882474.72325: Calling all_inventory to load vars for managed_node1 15627 1726882474.72327: Calling groups_inventory to load vars for managed_node1 15627 1726882474.72329: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.72333: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.72335: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.72338: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.73594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.74527: done with get_vars() 15627 1726882474.74547: done queuing things up, now waiting for results queue to drain 15627 1726882474.74549: results queue empty 15627 1726882474.74550: checking for any_errors_fatal 15627 1726882474.74551: done checking for any_errors_fatal 15627 1726882474.74551: checking for max_fail_percentage 15627 1726882474.74552: done checking for max_fail_percentage 15627 1726882474.74553: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882474.74554: done checking to see if all hosts have failed 15627 1726882474.74555: getting the remaining hosts for this loop 15627 1726882474.74555: done getting the remaining hosts for this loop 15627 1726882474.74557: getting the next task for host managed_node1 15627 1726882474.74560: done getting next task for host managed_node1 15627 1726882474.74560: ^ task is: None 15627 1726882474.74561: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882474.74562: done queuing things up, now waiting for results queue to drain 15627 1726882474.74563: results queue empty 15627 1726882474.74564: checking for any_errors_fatal 15627 1726882474.74565: done checking for any_errors_fatal 15627 1726882474.74565: checking for max_fail_percentage 15627 1726882474.74566: done checking for max_fail_percentage 15627 1726882474.74566: checking to see if all hosts have failed and the running result is not ok 15627 1726882474.74567: done checking to see if all hosts have failed 15627 1726882474.74568: getting the next task for host managed_node1 15627 1726882474.74569: done getting next task for host managed_node1 15627 1726882474.74570: ^ task is: None 15627 1726882474.74570: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882474.74618: in VariableManager get_vars() 15627 1726882474.74630: done with get_vars() 15627 1726882474.74634: in VariableManager get_vars() 15627 1726882474.74641: done with get_vars() 15627 1726882474.74644: variable 'omit' from source: magic vars 15627 1726882474.74740: variable 'task' from source: play vars 15627 1726882474.74767: in VariableManager get_vars() 15627 1726882474.74776: done with get_vars() 15627 1726882474.74789: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15627 1726882474.74908: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882474.74928: getting the remaining hosts for this loop 15627 1726882474.74929: done getting the remaining hosts for this loop 15627 1726882474.74931: getting the next task for host managed_node1 15627 1726882474.74933: done getting next task for host managed_node1 15627 1726882474.74934: ^ task is: TASK: Gathering Facts 15627 1726882474.74935: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882474.74936: getting variables 15627 1726882474.74937: in VariableManager get_vars() 15627 1726882474.74943: Calling all_inventory to load vars for managed_node1 15627 1726882474.74944: Calling groups_inventory to load vars for managed_node1 15627 1726882474.74946: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882474.74949: Calling all_plugins_play to load vars for managed_node1 15627 1726882474.74950: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882474.74952: Calling groups_plugins_play to load vars for managed_node1 15627 1726882474.75753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882474.80798: done with get_vars() 15627 1726882474.80815: done getting variables 15627 1726882474.80845: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:34:34 -0400 (0:00:00.660) 0:00:14.560 ****** 15627 1726882474.80866: entering _queue_task() for managed_node1/gather_facts 15627 1726882474.81088: worker is 1 (out of 1 available) 15627 1726882474.81099: exiting _queue_task() for managed_node1/gather_facts 15627 1726882474.81111: done queuing things up, now waiting for results queue to drain 15627 1726882474.81113: waiting for pending results... 
15627 1726882474.81291: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882474.81369: in run() - task 0e448fcc-3ce9-2847-7723-000000000219 15627 1726882474.81384: variable 'ansible_search_path' from source: unknown 15627 1726882474.81416: calling self._execute() 15627 1726882474.81487: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.81492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.81501: variable 'omit' from source: magic vars 15627 1726882474.81773: variable 'ansible_distribution_major_version' from source: facts 15627 1726882474.81783: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882474.81788: variable 'omit' from source: magic vars 15627 1726882474.81808: variable 'omit' from source: magic vars 15627 1726882474.81832: variable 'omit' from source: magic vars 15627 1726882474.82108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882474.82114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882474.82117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882474.82119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.82121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882474.82123: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882474.82125: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.82127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.82129: Set connection var ansible_timeout to 10 15627 1726882474.82138: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882474.82147: Set connection var ansible_connection to ssh 15627 1726882474.82168: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882474.82180: Set connection var ansible_pipelining to False 15627 1726882474.82192: Set connection var ansible_shell_type to sh 15627 1726882474.82219: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.82227: variable 'ansible_connection' from source: unknown 15627 1726882474.82234: variable 'ansible_module_compression' from source: unknown 15627 1726882474.82242: variable 'ansible_shell_type' from source: unknown 15627 1726882474.82249: variable 'ansible_shell_executable' from source: unknown 15627 1726882474.82259: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882474.82277: variable 'ansible_pipelining' from source: unknown 15627 1726882474.82284: variable 'ansible_timeout' from source: unknown 15627 1726882474.82293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882474.82484: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882474.82503: variable 'omit' from source: magic vars 15627 1726882474.82513: starting attempt loop 15627 1726882474.82523: running the handler 15627 1726882474.82543: variable 'ansible_facts' from source: unknown 15627 1726882474.82570: _low_level_execute_command(): starting 15627 1726882474.82583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882474.83286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882474.83301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.83323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882474.83340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.83388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.83401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.83411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.83517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.85190: stdout chunk (state=3): >>>/root <<< 15627 1726882474.85300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.85387: stderr chunk (state=3): >>><<< 15627 1726882474.85393: stdout chunk (state=3): >>><<< 15627 1726882474.85431: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.85464: _low_level_execute_command(): starting 15627 1726882474.85485: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230 `" && echo ansible-tmp-1726882474.8543787-16299-271920306765230="` echo /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230 `" ) && sleep 0' 15627 1726882474.86211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882474.86229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.86233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.86259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.86272: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882474.86288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 
1726882474.86304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882474.86311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882474.86319: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882474.86339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.86342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882474.86345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.86402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.86405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.86514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.88383: stdout chunk (state=3): >>>ansible-tmp-1726882474.8543787-16299-271920306765230=/root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230 <<< 15627 1726882474.88512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.88594: stderr chunk (state=3): >>><<< 15627 1726882474.88612: stdout chunk (state=3): >>><<< 15627 1726882474.88799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882474.8543787-16299-271920306765230=/root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.88803: variable 'ansible_module_compression' from source: unknown 15627 1726882474.88805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882474.88833: variable 'ansible_facts' from source: unknown 15627 1726882474.88947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/AnsiballZ_setup.py 15627 1726882474.89125: Sending initial data 15627 1726882474.89128: Sent initial data (154 bytes) 15627 1726882474.89911: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.89916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.89952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.89955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.89958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.90012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.90016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882474.90020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.90115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.91837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882474.91840: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882474.91925: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882474.92044: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-15627yb6z139m/tmp3h6civfx /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/AnsiballZ_setup.py <<< 15627 1726882474.92127: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882474.94686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.94800: stderr chunk (state=3): >>><<< 15627 1726882474.94804: stdout chunk (state=3): >>><<< 15627 1726882474.94821: done transferring module to remote 15627 1726882474.94830: _low_level_execute_command(): starting 15627 1726882474.94835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/ /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/AnsiballZ_setup.py && sleep 0' 15627 1726882474.95309: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882474.95312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.95333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882474.95340: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882474.95370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.95374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.95376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.95422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.95426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.95527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882474.97395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882474.97450: stderr chunk (state=3): >>><<< 15627 1726882474.97460: stdout chunk (state=3): >>><<< 15627 1726882474.97481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882474.97484: _low_level_execute_command(): starting 15627 
1726882474.97486: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/AnsiballZ_setup.py && sleep 0' 15627 1726882474.98004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882474.98009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.98011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882474.98078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.98096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882474.98107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882474.98144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882474.98149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882474.98268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882475.50130: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP 
PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_<<< 15627 1726882475.50171: stdout chunk (state=3): >>>hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_loadavg": {"1m": 0.46, "5m": 0.37, "15m": 0.19}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "an<<< 15627 1726882475.50194: stdout chunk (state=3): >>>sible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "35", "epoch": "1726882475", "epoch_int": "1726882475", "date": "2024-09-20", "time": "21:34:35", "iso8601_micro": "2024-09-21T01:34:35.279138Z", "iso8601": "2024-09-21T01:34:35Z", "iso8601_basic": "20240920T213435279138", "iso8601_basic_short": 
"20240920T213435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2820, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 712, "free": 2820}, "nocache": {"free": 3281, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], 
"masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 633, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882475.51872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882475.51876: stdout chunk (state=3): >>><<< 15627 1726882475.51879: stderr chunk (state=3): >>><<< 15627 1726882475.52171: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": 
"22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": 
"off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_loadavg": {"1m": 0.46, "5m": 0.37, "15m": 0.19}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "35", "epoch": "1726882475", "epoch_int": 
"1726882475", "date": "2024-09-20", "time": "21:34:35", "iso8601_micro": "2024-09-21T01:34:35.279138Z", "iso8601": "2024-09-21T01:34:35Z", "iso8601_basic": "20240920T213435279138", "iso8601_basic_short": "20240920T213435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2820, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 712, "free": 2820}, "nocache": {"free": 3281, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 633, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882475.52358: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882475.52385: _low_level_execute_command(): starting 15627 1726882475.52394: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882474.8543787-16299-271920306765230/ > /dev/null 2>&1 && sleep 0' 15627 1726882475.53087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882475.53101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882475.53118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882475.53135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 
1726882475.53187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882475.53199: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882475.53212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882475.53228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882475.53239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882475.53249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882475.53260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882475.53283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882475.53299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882475.53310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882475.53320: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882475.53332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882475.53421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882475.53444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882475.53460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882475.53587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882475.55480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882475.55510: stderr chunk (state=3): >>><<< 15627 1726882475.55514: stdout chunk (state=3): >>><<< 15627 1726882475.55678: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882475.55682: handler run complete 15627 1726882475.55684: variable 'ansible_facts' from source: unknown 15627 1726882475.55788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.56126: variable 'ansible_facts' from source: unknown 15627 1726882475.56220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.56371: attempt loop complete, returning result 15627 1726882475.56380: _execute() done 15627 1726882475.56386: dumping result to json 15627 1726882475.56423: done dumping result, returning 15627 1726882475.56435: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-000000000219] 15627 1726882475.56444: 
sending task result for task 0e448fcc-3ce9-2847-7723-000000000219 ok: [managed_node1] 15627 1726882475.57118: no more pending results, returning what we have 15627 1726882475.57121: results queue empty 15627 1726882475.57122: checking for any_errors_fatal 15627 1726882475.57123: done checking for any_errors_fatal 15627 1726882475.57124: checking for max_fail_percentage 15627 1726882475.57126: done checking for max_fail_percentage 15627 1726882475.57127: checking to see if all hosts have failed and the running result is not ok 15627 1726882475.57128: done checking to see if all hosts have failed 15627 1726882475.57129: getting the remaining hosts for this loop 15627 1726882475.57130: done getting the remaining hosts for this loop 15627 1726882475.57134: getting the next task for host managed_node1 15627 1726882475.57141: done getting next task for host managed_node1 15627 1726882475.57143: ^ task is: TASK: meta (flush_handlers) 15627 1726882475.57145: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882475.57150: getting variables 15627 1726882475.57152: in VariableManager get_vars() 15627 1726882475.57178: Calling all_inventory to load vars for managed_node1 15627 1726882475.57181: Calling groups_inventory to load vars for managed_node1 15627 1726882475.57184: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.57196: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.57199: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.57202: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.58284: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000219 15627 1726882475.58287: WORKER PROCESS EXITING 15627 1726882475.58914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.60607: done with get_vars() 15627 1726882475.60631: done getting variables 15627 1726882475.60702: in VariableManager get_vars() 15627 1726882475.60713: Calling all_inventory to load vars for managed_node1 15627 1726882475.60715: Calling groups_inventory to load vars for managed_node1 15627 1726882475.60718: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.60723: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.60725: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.60737: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.62041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.63729: done with get_vars() 15627 1726882475.63758: done queuing things up, now waiting for results queue to drain 15627 1726882475.63760: results queue empty 15627 1726882475.63761: checking for any_errors_fatal 15627 1726882475.63766: done checking for any_errors_fatal 15627 1726882475.63767: checking for max_fail_percentage 15627 
1726882475.63768: done checking for max_fail_percentage 15627 1726882475.63769: checking to see if all hosts have failed and the running result is not ok 15627 1726882475.63770: done checking to see if all hosts have failed 15627 1726882475.63771: getting the remaining hosts for this loop 15627 1726882475.63772: done getting the remaining hosts for this loop 15627 1726882475.63774: getting the next task for host managed_node1 15627 1726882475.63778: done getting next task for host managed_node1 15627 1726882475.63781: ^ task is: TASK: Include the task '{{ task }}' 15627 1726882475.63782: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882475.63785: getting variables 15627 1726882475.63786: in VariableManager get_vars() 15627 1726882475.63795: Calling all_inventory to load vars for managed_node1 15627 1726882475.63797: Calling groups_inventory to load vars for managed_node1 15627 1726882475.63799: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.63805: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.63807: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.63810: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.66032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.69424: done with get_vars() 15627 1726882475.69447: done getting variables 15627 1726882475.69615: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:34:35 
-0400 (0:00:00.887) 0:00:15.448 ****** 15627 1726882475.69644: entering _queue_task() for managed_node1/include_tasks 15627 1726882475.70010: worker is 1 (out of 1 available) 15627 1726882475.70021: exiting _queue_task() for managed_node1/include_tasks 15627 1726882475.70035: done queuing things up, now waiting for results queue to drain 15627 1726882475.70036: waiting for pending results... 15627 1726882475.70307: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' 15627 1726882475.70416: in run() - task 0e448fcc-3ce9-2847-7723-00000000002d 15627 1726882475.70438: variable 'ansible_search_path' from source: unknown 15627 1726882475.70485: calling self._execute() 15627 1726882475.70569: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882475.70580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882475.70597: variable 'omit' from source: magic vars 15627 1726882475.70956: variable 'ansible_distribution_major_version' from source: facts 15627 1726882475.70975: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882475.70986: variable 'task' from source: play vars 15627 1726882475.71068: variable 'task' from source: play vars 15627 1726882475.71081: _execute() done 15627 1726882475.71089: dumping result to json 15627 1726882475.71096: done dumping result, returning 15627 1726882475.71104: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' [0e448fcc-3ce9-2847-7723-00000000002d] 15627 1726882475.71114: sending task result for task 0e448fcc-3ce9-2847-7723-00000000002d 15627 1726882475.71348: no more pending results, returning what we have 15627 1726882475.71354: in VariableManager get_vars() 15627 1726882475.71441: Calling all_inventory to load vars for managed_node1 15627 1726882475.71444: Calling groups_inventory to load vars for managed_node1 15627 1726882475.71449: Calling 
all_plugins_inventory to load vars for managed_node1 15627 1726882475.71465: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.71469: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.71472: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.72598: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000002d 15627 1726882475.72622: WORKER PROCESS EXITING 15627 1726882475.75085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.79385: done with get_vars() 15627 1726882475.79411: variable 'ansible_search_path' from source: unknown 15627 1726882475.79427: we have included files to process 15627 1726882475.79428: generating all_blocks data 15627 1726882475.79430: done generating all_blocks data 15627 1726882475.79431: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15627 1726882475.79431: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15627 1726882475.79434: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15627 1726882475.79611: in VariableManager get_vars() 15627 1726882475.79627: done with get_vars() 15627 1726882475.79741: done processing included file 15627 1726882475.79744: iterating over new_blocks loaded from include file 15627 1726882475.79745: in VariableManager get_vars() 15627 1726882475.79760: done with get_vars() 15627 1726882475.79762: filtering new block on tags 15627 1726882475.79781: done filtering new block on tags 15627 1726882475.79783: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 15627 1726882475.79789: extending task lists for all hosts with included blocks 15627 1726882475.79820: done extending task lists 15627 1726882475.79821: done processing included files 15627 1726882475.79822: results queue empty 15627 1726882475.79822: checking for any_errors_fatal 15627 1726882475.79824: done checking for any_errors_fatal 15627 1726882475.79825: checking for max_fail_percentage 15627 1726882475.79840: done checking for max_fail_percentage 15627 1726882475.79841: checking to see if all hosts have failed and the running result is not ok 15627 1726882475.79842: done checking to see if all hosts have failed 15627 1726882475.79843: getting the remaining hosts for this loop 15627 1726882475.79844: done getting the remaining hosts for this loop 15627 1726882475.79847: getting the next task for host managed_node1 15627 1726882475.79851: done getting next task for host managed_node1 15627 1726882475.79856: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15627 1726882475.79859: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882475.79861: getting variables 15627 1726882475.79862: in VariableManager get_vars() 15627 1726882475.79873: Calling all_inventory to load vars for managed_node1 15627 1726882475.79875: Calling groups_inventory to load vars for managed_node1 15627 1726882475.79878: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.79883: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.79885: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.79889: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.81412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.83189: done with get_vars() 15627 1726882475.83219: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:34:35 -0400 (0:00:00.136) 0:00:15.584 ****** 15627 1726882475.83304: entering _queue_task() for managed_node1/include_tasks 15627 1726882475.83640: worker is 1 (out of 1 available) 15627 1726882475.83652: exiting _queue_task() for managed_node1/include_tasks 15627 1726882475.83669: done queuing things up, now waiting for results queue to drain 15627 1726882475.83671: waiting for pending results... 
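The include chain traced above (run_tasks.yml:6 resolving '{{ task }}' to tasks/assert_device_present.yml, which in turn includes get_interface_stat.yml) is the classic templated `include_tasks` pattern. A minimal sketch of what run_tasks.yml:6 likely contains — only the task name and the fact that `task` is a play var are confirmed by this trace, so the exact body is an assumption:

```yaml
# run_tasks.yml (sketch; only the task name "Include the task '{{ task }}'"
# and the play-var source of `task` appear in the log — the body is assumed)
- name: "Include the task '{{ task }}'"
  include_tasks: "{{ task }}"  # here templated to 'tasks/assert_device_present.yml'
```

Because the include target is templated, the loader can only resolve it per host at runtime, which is why the trace shows `variable 'task' from source: play vars` immediately before each include is processed.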
15627 1726882475.83948: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15627 1726882475.84120: in run() - task 0e448fcc-3ce9-2847-7723-00000000022a 15627 1726882475.84138: variable 'ansible_search_path' from source: unknown 15627 1726882475.84145: variable 'ansible_search_path' from source: unknown 15627 1726882475.84196: calling self._execute() 15627 1726882475.84289: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882475.84300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882475.84312: variable 'omit' from source: magic vars 15627 1726882475.84679: variable 'ansible_distribution_major_version' from source: facts 15627 1726882475.84696: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882475.84706: _execute() done 15627 1726882475.84712: dumping result to json 15627 1726882475.84719: done dumping result, returning 15627 1726882475.84727: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-2847-7723-00000000022a] 15627 1726882475.84736: sending task result for task 0e448fcc-3ce9-2847-7723-00000000022a 15627 1726882475.84844: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000022a 15627 1726882475.84851: WORKER PROCESS EXITING 15627 1726882475.84891: no more pending results, returning what we have 15627 1726882475.84896: in VariableManager get_vars() 15627 1726882475.84928: Calling all_inventory to load vars for managed_node1 15627 1726882475.84932: Calling groups_inventory to load vars for managed_node1 15627 1726882475.84935: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.84949: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.84953: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.84959: Calling groups_plugins_play to load vars for managed_node1 15627 
1726882475.86643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.88493: done with get_vars() 15627 1726882475.88510: variable 'ansible_search_path' from source: unknown 15627 1726882475.88512: variable 'ansible_search_path' from source: unknown 15627 1726882475.88521: variable 'task' from source: play vars 15627 1726882475.88634: variable 'task' from source: play vars 15627 1726882475.88675: we have included files to process 15627 1726882475.88677: generating all_blocks data 15627 1726882475.88679: done generating all_blocks data 15627 1726882475.88680: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882475.88681: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882475.88683: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882475.88866: done processing included file 15627 1726882475.88868: iterating over new_blocks loaded from include file 15627 1726882475.88870: in VariableManager get_vars() 15627 1726882475.88884: done with get_vars() 15627 1726882475.88885: filtering new block on tags 15627 1726882475.88901: done filtering new block on tags 15627 1726882475.88903: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15627 1726882475.88908: extending task lists for all hosts with included blocks 15627 1726882475.89007: done extending task lists 15627 1726882475.89008: done processing included files 15627 1726882475.89009: results queue empty 15627 1726882475.89010: checking for any_errors_fatal 15627 1726882475.89014: done checking 
for any_errors_fatal 15627 1726882475.89014: checking for max_fail_percentage 15627 1726882475.89016: done checking for max_fail_percentage 15627 1726882475.89016: checking to see if all hosts have failed and the running result is not ok 15627 1726882475.89017: done checking to see if all hosts have failed 15627 1726882475.89018: getting the remaining hosts for this loop 15627 1726882475.89019: done getting the remaining hosts for this loop 15627 1726882475.89022: getting the next task for host managed_node1 15627 1726882475.89026: done getting next task for host managed_node1 15627 1726882475.89028: ^ task is: TASK: Get stat for interface {{ interface }} 15627 1726882475.89031: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882475.89033: getting variables 15627 1726882475.89034: in VariableManager get_vars() 15627 1726882475.89043: Calling all_inventory to load vars for managed_node1 15627 1726882475.89046: Calling groups_inventory to load vars for managed_node1 15627 1726882475.89048: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882475.89056: Calling all_plugins_play to load vars for managed_node1 15627 1726882475.89058: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882475.89061: Calling groups_plugins_play to load vars for managed_node1 15627 1726882475.90365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882475.92060: done with get_vars() 15627 1726882475.92088: done getting variables 15627 1726882475.92227: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:34:35 -0400 (0:00:00.089) 0:00:15.674 ****** 15627 1726882475.92257: entering _queue_task() for managed_node1/stat 15627 1726882475.92596: worker is 1 (out of 1 available) 15627 1726882475.92608: exiting _queue_task() for managed_node1/stat 15627 1726882475.92621: done queuing things up, now waiting for results queue to drain 15627 1726882475.92622: waiting for pending results... 
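The task header above points at get_interface_stat.yml:3, where '{{ interface }}' has been templated to LSR-TST-br31 from a `set_fact` variable. The file body is not reproduced in the log; a plausible sketch, assuming the conventional sysfs presence check — the `path` and `register` name are hypothetical:

```yaml
# get_interface_stat.yml (sketch; only the task name
# "Get stat for interface {{ interface }}" is confirmed by the log —
# the stat path and register name are assumptions)
- name: "Get stat for interface {{ interface }}"
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```

A stat against /sys/class/net is a common way to assert device presence without shelling out to `ip link`, since the kernel exposes one directory per network interface there.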
15627 1726882475.92901: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15627 1726882475.93016: in run() - task 0e448fcc-3ce9-2847-7723-000000000235 15627 1726882475.93038: variable 'ansible_search_path' from source: unknown 15627 1726882475.93046: variable 'ansible_search_path' from source: unknown 15627 1726882475.93096: calling self._execute() 15627 1726882475.93192: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882475.93202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882475.93216: variable 'omit' from source: magic vars 15627 1726882475.93700: variable 'ansible_distribution_major_version' from source: facts 15627 1726882475.93837: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882475.93849: variable 'omit' from source: magic vars 15627 1726882475.93905: variable 'omit' from source: magic vars 15627 1726882475.94121: variable 'interface' from source: set_fact 15627 1726882475.94148: variable 'omit' from source: magic vars 15627 1726882475.94298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882475.94334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882475.94365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882475.94389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882475.94488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882475.94526: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882475.94535: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882475.94543: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882475.94798: Set connection var ansible_timeout to 10 15627 1726882475.94815: Set connection var ansible_shell_executable to /bin/sh 15627 1726882475.94825: Set connection var ansible_connection to ssh 15627 1726882475.94834: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882475.94843: Set connection var ansible_pipelining to False 15627 1726882475.94911: Set connection var ansible_shell_type to sh 15627 1726882475.94941: variable 'ansible_shell_executable' from source: unknown 15627 1726882475.94949: variable 'ansible_connection' from source: unknown 15627 1726882475.94959: variable 'ansible_module_compression' from source: unknown 15627 1726882475.94968: variable 'ansible_shell_type' from source: unknown 15627 1726882475.94975: variable 'ansible_shell_executable' from source: unknown 15627 1726882475.94982: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882475.95021: variable 'ansible_pipelining' from source: unknown 15627 1726882475.95029: variable 'ansible_timeout' from source: unknown 15627 1726882475.95037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882475.95493: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882475.95575: variable 'omit' from source: magic vars 15627 1726882475.95586: starting attempt loop 15627 1726882475.95593: running the handler 15627 1726882475.95612: _low_level_execute_command(): starting 15627 1726882475.95625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882475.97490: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882475.97509: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15627 1726882475.97553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882475.97595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882475.97638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882475.97657: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882475.97676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882475.97697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882475.97710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882475.97723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882475.97737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882475.97752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882475.97778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882475.97791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882475.97801: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882475.97813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882475.97893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882475.97908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882475.97922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882475.98070: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15627 1726882475.99739: stdout chunk (state=3): >>>/root <<< 15627 1726882475.99852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.00210: stderr chunk (state=3): >>><<< 15627 1726882476.00224: stdout chunk (state=3): >>><<< 15627 1726882476.00348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882476.00352: _low_level_execute_command(): starting 15627 1726882476.00355: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768 `" && echo ansible-tmp-1726882476.002568-16324-280237228815768="` echo /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768 `" ) && sleep 0' 15627 
1726882476.01706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.01730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.01748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.01768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.01821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.01839: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.01854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.01875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.01887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.01907: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.01919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.01930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.01945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.01957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.01972: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.01985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.02076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.02098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.02118: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.02261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.04145: stdout chunk (state=3): >>>ansible-tmp-1726882476.002568-16324-280237228815768=/root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768 <<< 15627 1726882476.04286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.04385: stderr chunk (state=3): >>><<< 15627 1726882476.04388: stdout chunk (state=3): >>><<< 15627 1726882476.04491: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882476.002568-16324-280237228815768=/root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882476.04573: variable 'ansible_module_compression' from source: unknown 15627 
1726882476.04751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15627 1726882476.04755: variable 'ansible_facts' from source: unknown 15627 1726882476.04811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/AnsiballZ_stat.py 15627 1726882476.06339: Sending initial data 15627 1726882476.06343: Sent initial data (152 bytes) 15627 1726882476.08472: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.08493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.08509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.08529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.08596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.08609: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.08625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.08647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.08667: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.08682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.08697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.08779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.08804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.08819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.08832: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.08846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.09001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.09032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.09050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.09180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.10971: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882476.10976: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882476.11057: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882476.11155: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpmwe9xatw /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/AnsiballZ_stat.py <<< 15627 1726882476.11240: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882476.12810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.13084: stderr chunk (state=3): >>><<< 
15627 1726882476.13087: stdout chunk (state=3): >>><<< 15627 1726882476.13089: done transferring module to remote 15627 1726882476.13095: _low_level_execute_command(): starting 15627 1726882476.13098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/ /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/AnsiballZ_stat.py && sleep 0' 15627 1726882476.14662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.14680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.14695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.14714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.14770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.14784: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.14798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.14816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.14827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.14838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.14858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.14880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.14897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.14909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 <<< 15627 1726882476.14920: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.14982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.15060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.15194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.15213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.15405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.17184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.17268: stderr chunk (state=3): >>><<< 15627 1726882476.17271: stdout chunk (state=3): >>><<< 15627 1726882476.17362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 15627 1726882476.17370: _low_level_execute_command(): starting 15627 1726882476.17373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/AnsiballZ_stat.py && sleep 0' 15627 1726882476.18829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.18858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.18878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.18905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.18948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.18973: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.18993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.19019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.19032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.19043: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.19058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.19078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.19095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.19115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.19131: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.19146: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.19236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.19258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.19276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.19450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.32654: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26643, "dev": 21, "nlink": 1, "atime": 1726882473.89121, "mtime": 1726882473.89121, "ctime": 1726882473.89121, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15627 1726882476.33748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882476.33753: stdout chunk (state=3): >>><<< 15627 1726882476.33755: stderr chunk (state=3): >>><<< 15627 1726882476.33862: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26643, "dev": 21, "nlink": 1, "atime": 1726882473.89121, "mtime": 1726882473.89121, "ctime": 1726882473.89121, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882476.33875: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882476.33878: _low_level_execute_command(): starting 15627 1726882476.33880: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882476.002568-16324-280237228815768/ > /dev/null 2>&1 && sleep 0' 15627 1726882476.35712: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.35727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.35742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.35760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.35882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 
15627 1726882476.35901: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.35921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.35938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.35949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.35958: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.35971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.35983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.35996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.36013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.36026: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.36039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.36117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.36251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.36268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.36469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.38375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.38378: stdout chunk (state=3): >>><<< 15627 1726882476.38381: stderr chunk (state=3): >>><<< 15627 1726882476.38574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882476.38578: handler run complete 15627 1726882476.38580: attempt loop complete, returning result 15627 1726882476.38583: _execute() done 15627 1726882476.38585: dumping result to json 15627 1726882476.38587: done dumping result, returning 15627 1726882476.38589: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000235] 15627 1726882476.38591: sending task result for task 0e448fcc-3ce9-2847-7723-000000000235 15627 1726882476.38668: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000235 15627 1726882476.38672: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882473.89121, "block_size": 4096, "blocks": 0, "ctime": 1726882473.89121, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26643, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": 
false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1726882473.89121, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15627 1726882476.38777: no more pending results, returning what we have 15627 1726882476.38780: results queue empty 15627 1726882476.38781: checking for any_errors_fatal 15627 1726882476.38784: done checking for any_errors_fatal 15627 1726882476.38785: checking for max_fail_percentage 15627 1726882476.38787: done checking for max_fail_percentage 15627 1726882476.38788: checking to see if all hosts have failed and the running result is not ok 15627 1726882476.38790: done checking to see if all hosts have failed 15627 1726882476.38790: getting the remaining hosts for this loop 15627 1726882476.38792: done getting the remaining hosts for this loop 15627 1726882476.38796: getting the next task for host managed_node1 15627 1726882476.38806: done getting next task for host managed_node1 15627 1726882476.38809: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15627 1726882476.38813: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882476.38818: getting variables 15627 1726882476.38820: in VariableManager get_vars() 15627 1726882476.38848: Calling all_inventory to load vars for managed_node1 15627 1726882476.38855: Calling groups_inventory to load vars for managed_node1 15627 1726882476.38859: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882476.38871: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.38875: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.38878: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.41316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.43989: done with get_vars() 15627 1726882476.44010: done getting variables 15627 1726882476.44186: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15627 1726882476.44416: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:34:36 -0400 (0:00:00.521) 0:00:16.196 ****** 15627 1726882476.44446: entering _queue_task() for managed_node1/assert 15627 1726882476.45056: worker is 1 (out of 1 available) 15627 1726882476.45071: exiting _queue_task() for managed_node1/assert 15627 1726882476.45080: done queuing things up, now waiting for results queue to drain 15627 1726882476.45081: waiting for pending results... 
15627 1726882476.45517: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' 15627 1726882476.45623: in run() - task 0e448fcc-3ce9-2847-7723-00000000022b 15627 1726882476.45640: variable 'ansible_search_path' from source: unknown 15627 1726882476.45647: variable 'ansible_search_path' from source: unknown 15627 1726882476.45687: calling self._execute() 15627 1726882476.45774: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.45785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.45832: variable 'omit' from source: magic vars 15627 1726882476.46177: variable 'ansible_distribution_major_version' from source: facts 15627 1726882476.46193: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882476.46205: variable 'omit' from source: magic vars 15627 1726882476.46243: variable 'omit' from source: magic vars 15627 1726882476.46342: variable 'interface' from source: set_fact 15627 1726882476.46368: variable 'omit' from source: magic vars 15627 1726882476.46408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882476.46445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882476.46476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882476.46497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882476.46513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882476.46546: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882476.46556: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.46566: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.46669: Set connection var ansible_timeout to 10 15627 1726882476.46683: Set connection var ansible_shell_executable to /bin/sh 15627 1726882476.46692: Set connection var ansible_connection to ssh 15627 1726882476.46701: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882476.46709: Set connection var ansible_pipelining to False 15627 1726882476.46715: Set connection var ansible_shell_type to sh 15627 1726882476.46740: variable 'ansible_shell_executable' from source: unknown 15627 1726882476.46748: variable 'ansible_connection' from source: unknown 15627 1726882476.46757: variable 'ansible_module_compression' from source: unknown 15627 1726882476.46765: variable 'ansible_shell_type' from source: unknown 15627 1726882476.46772: variable 'ansible_shell_executable' from source: unknown 15627 1726882476.46778: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.46785: variable 'ansible_pipelining' from source: unknown 15627 1726882476.46791: variable 'ansible_timeout' from source: unknown 15627 1726882476.46797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.46931: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882476.46946: variable 'omit' from source: magic vars 15627 1726882476.46957: starting attempt loop 15627 1726882476.46966: running the handler 15627 1726882476.47097: variable 'interface_stat' from source: set_fact 15627 1726882476.47675: Evaluated conditional (interface_stat.stat.exists): True 15627 1726882476.47685: handler run complete 15627 1726882476.47702: attempt loop complete, returning result 15627 
1726882476.47708: _execute() done 15627 1726882476.47714: dumping result to json 15627 1726882476.47720: done dumping result, returning 15627 1726882476.47731: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-00000000022b] 15627 1726882476.47740: sending task result for task 0e448fcc-3ce9-2847-7723-00000000022b ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882476.47899: no more pending results, returning what we have 15627 1726882476.47902: results queue empty 15627 1726882476.47903: checking for any_errors_fatal 15627 1726882476.47910: done checking for any_errors_fatal 15627 1726882476.47911: checking for max_fail_percentage 15627 1726882476.47913: done checking for max_fail_percentage 15627 1726882476.47913: checking to see if all hosts have failed and the running result is not ok 15627 1726882476.47915: done checking to see if all hosts have failed 15627 1726882476.47915: getting the remaining hosts for this loop 15627 1726882476.47917: done getting the remaining hosts for this loop 15627 1726882476.47920: getting the next task for host managed_node1 15627 1726882476.47929: done getting next task for host managed_node1 15627 1726882476.47932: ^ task is: TASK: meta (flush_handlers) 15627 1726882476.47933: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882476.47939: getting variables 15627 1726882476.47940: in VariableManager get_vars() 15627 1726882476.47968: Calling all_inventory to load vars for managed_node1 15627 1726882476.47970: Calling groups_inventory to load vars for managed_node1 15627 1726882476.47974: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882476.47985: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.47988: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.47991: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.48512: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000022b 15627 1726882476.48515: WORKER PROCESS EXITING 15627 1726882476.49436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.53171: done with get_vars() 15627 1726882476.53194: done getting variables 15627 1726882476.53277: in VariableManager get_vars() 15627 1726882476.53288: Calling all_inventory to load vars for managed_node1 15627 1726882476.53290: Calling groups_inventory to load vars for managed_node1 15627 1726882476.53293: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882476.53298: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.53300: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.53303: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.54898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.57328: done with get_vars() 15627 1726882476.57357: done queuing things up, now waiting for results queue to drain 15627 1726882476.57359: results queue empty 15627 1726882476.57360: checking for any_errors_fatal 15627 1726882476.57362: done checking for any_errors_fatal 15627 1726882476.57365: checking for max_fail_percentage 15627 
1726882476.57366: done checking for max_fail_percentage 15627 1726882476.57367: checking to see if all hosts have failed and the running result is not ok 15627 1726882476.57367: done checking to see if all hosts have failed 15627 1726882476.57372: getting the remaining hosts for this loop 15627 1726882476.57373: done getting the remaining hosts for this loop 15627 1726882476.57376: getting the next task for host managed_node1 15627 1726882476.57379: done getting next task for host managed_node1 15627 1726882476.57381: ^ task is: TASK: meta (flush_handlers) 15627 1726882476.57382: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882476.57384: getting variables 15627 1726882476.57385: in VariableManager get_vars() 15627 1726882476.57393: Calling all_inventory to load vars for managed_node1 15627 1726882476.57395: Calling groups_inventory to load vars for managed_node1 15627 1726882476.57396: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882476.57401: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.57403: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.57406: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.59876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.64986: done with get_vars() 15627 1726882476.65224: done getting variables 15627 1726882476.65358: in VariableManager get_vars() 15627 1726882476.65369: Calling all_inventory to load vars for managed_node1 15627 1726882476.65372: Calling groups_inventory to load vars for managed_node1 15627 1726882476.65374: Calling all_plugins_inventory to load vars for managed_node1 15627 
1726882476.65379: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.65381: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.65383: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.68878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.74825: done with get_vars() 15627 1726882476.74863: done queuing things up, now waiting for results queue to drain 15627 1726882476.74982: results queue empty 15627 1726882476.74983: checking for any_errors_fatal 15627 1726882476.74985: done checking for any_errors_fatal 15627 1726882476.74986: checking for max_fail_percentage 15627 1726882476.74987: done checking for max_fail_percentage 15627 1726882476.74988: checking to see if all hosts have failed and the running result is not ok 15627 1726882476.74989: done checking to see if all hosts have failed 15627 1726882476.74989: getting the remaining hosts for this loop 15627 1726882476.74990: done getting the remaining hosts for this loop 15627 1726882476.74994: getting the next task for host managed_node1 15627 1726882476.74997: done getting next task for host managed_node1 15627 1726882476.74998: ^ task is: None 15627 1726882476.75000: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882476.75001: done queuing things up, now waiting for results queue to drain 15627 1726882476.75002: results queue empty 15627 1726882476.75003: checking for any_errors_fatal 15627 1726882476.75004: done checking for any_errors_fatal 15627 1726882476.75004: checking for max_fail_percentage 15627 1726882476.75005: done checking for max_fail_percentage 15627 1726882476.75006: checking to see if all hosts have failed and the running result is not ok 15627 1726882476.75007: done checking to see if all hosts have failed 15627 1726882476.75008: getting the next task for host managed_node1 15627 1726882476.75010: done getting next task for host managed_node1 15627 1726882476.75011: ^ task is: None 15627 1726882476.75012: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882476.75149: in VariableManager get_vars() 15627 1726882476.75170: done with get_vars() 15627 1726882476.75177: in VariableManager get_vars() 15627 1726882476.75286: done with get_vars() 15627 1726882476.75293: variable 'omit' from source: magic vars 15627 1726882476.75607: variable 'task' from source: play vars 15627 1726882476.75836: in VariableManager get_vars() 15627 1726882476.75846: done with get_vars() 15627 1726882476.75992: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15627 1726882476.76608: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882476.77206: getting the remaining hosts for this loop 15627 1726882476.77208: done getting the remaining hosts for this loop 15627 1726882476.77211: getting the next task for host managed_node1 15627 1726882476.77213: done getting next task for host managed_node1 15627 1726882476.77215: ^ task is: TASK: Gathering Facts 15627 1726882476.77217: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882476.77219: getting variables 15627 1726882476.77220: in VariableManager get_vars() 15627 1726882476.77247: Calling all_inventory to load vars for managed_node1 15627 1726882476.77250: Calling groups_inventory to load vars for managed_node1 15627 1726882476.77256: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882476.77262: Calling all_plugins_play to load vars for managed_node1 15627 1726882476.77266: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882476.77268: Calling groups_plugins_play to load vars for managed_node1 15627 1726882476.81358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882476.86166: done with get_vars() 15627 1726882476.86195: done getting variables 15627 1726882476.86379: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:34:36 -0400 (0:00:00.419) 0:00:16.615 ****** 15627 1726882476.86407: entering _queue_task() for managed_node1/gather_facts 15627 1726882476.87586: worker is 1 (out of 1 available) 15627 1726882476.87597: exiting _queue_task() for managed_node1/gather_facts 15627 1726882476.87695: done queuing things up, now waiting for results queue to drain 15627 1726882476.87697: waiting for pending results... 
15627 1726882476.88782: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882476.89109: in run() - task 0e448fcc-3ce9-2847-7723-00000000024e 15627 1726882476.89148: variable 'ansible_search_path' from source: unknown 15627 1726882476.89205: calling self._execute() 15627 1726882476.89323: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.89333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.89344: variable 'omit' from source: magic vars 15627 1726882476.90072: variable 'ansible_distribution_major_version' from source: facts 15627 1726882476.90096: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882476.90108: variable 'omit' from source: magic vars 15627 1726882476.90137: variable 'omit' from source: magic vars 15627 1726882476.90181: variable 'omit' from source: magic vars 15627 1726882476.90233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882476.90277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882476.90311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882476.90335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882476.90352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882476.90391: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882476.90401: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.90413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.90577: Set connection var ansible_timeout to 10 15627 1726882476.90593: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882476.90602: Set connection var ansible_connection to ssh 15627 1726882476.90612: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882476.90626: Set connection var ansible_pipelining to False 15627 1726882476.90638: Set connection var ansible_shell_type to sh 15627 1726882476.90673: variable 'ansible_shell_executable' from source: unknown 15627 1726882476.90681: variable 'ansible_connection' from source: unknown 15627 1726882476.90687: variable 'ansible_module_compression' from source: unknown 15627 1726882476.90694: variable 'ansible_shell_type' from source: unknown 15627 1726882476.90740: variable 'ansible_shell_executable' from source: unknown 15627 1726882476.90752: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882476.90766: variable 'ansible_pipelining' from source: unknown 15627 1726882476.90774: variable 'ansible_timeout' from source: unknown 15627 1726882476.90782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882476.90998: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882476.91013: variable 'omit' from source: magic vars 15627 1726882476.91023: starting attempt loop 15627 1726882476.91030: running the handler 15627 1726882476.91050: variable 'ansible_facts' from source: unknown 15627 1726882476.91085: _low_level_execute_command(): starting 15627 1726882476.91099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882476.91991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.92008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882476.92023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.92044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.92092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.92102: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.92114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.92133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.92150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.92175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.92189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.92203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.92219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.92230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.92239: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.92250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.92334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.92350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.92370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.92507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882476.94169: stdout chunk (state=3): >>>/root <<< 15627 1726882476.94357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.94360: stdout chunk (state=3): >>><<< 15627 1726882476.94363: stderr chunk (state=3): >>><<< 15627 1726882476.94474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882476.94478: _low_level_execute_command(): starting 15627 1726882476.94481: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950 `" && echo ansible-tmp-1726882476.9438565-16366-279005448150950="` echo /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950 `" ) && sleep 0' 15627 1726882476.95014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882476.95028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.95044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.95068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.95109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.95121: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.95136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.95157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.95176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.95188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.95201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.95216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.95233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.95247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.95262: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.95279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.95352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.95378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.95395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882476.95524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882476.97392: stdout chunk (state=3): >>>ansible-tmp-1726882476.9438565-16366-279005448150950=/root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950 <<< 15627 1726882476.97500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882476.97602: stderr chunk (state=3): >>><<< 15627 1726882476.97617: stdout chunk (state=3): >>><<< 15627 1726882476.97875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882476.9438565-16366-279005448150950=/root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882476.97878: variable 'ansible_module_compression' from source: unknown 15627 1726882476.97880: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882476.97882: variable 'ansible_facts' from source: unknown 15627 1726882476.97996: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/AnsiballZ_setup.py 15627 1726882476.98162: Sending initial data 15627 1726882476.98167: Sent initial data (154 bytes) 15627 1726882476.99327: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882476.99340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.99353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.99375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.99424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882476.99435: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882476.99448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.99470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882476.99481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882476.99491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882476.99501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882476.99521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882476.99539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882476.99553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882476.99570: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882476.99585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882476.99680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882476.99703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882476.99719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882476.99852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882477.01575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882477.01662: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882477.01756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp0se4117u /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/AnsiballZ_setup.py <<< 15627 1726882477.01845: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882477.04503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882477.04681: stderr chunk (state=3): >>><<< 15627 1726882477.04684: stdout chunk (state=3): >>><<< 15627 1726882477.04686: done transferring 
module to remote 15627 1726882477.04691: _low_level_execute_command(): starting 15627 1726882477.04694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/ /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/AnsiballZ_setup.py && sleep 0' 15627 1726882477.05224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882477.05227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882477.05276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882477.05279: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15627 1726882477.05282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882477.05284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882477.05291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882477.05333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882477.05336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882477.05431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882477.07229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882477.07301: stderr chunk (state=3): >>><<< 15627 1726882477.07304: stdout chunk (state=3): >>><<< 15627 1726882477.07393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882477.07396: _low_level_execute_command(): starting 15627 1726882477.07399: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/AnsiballZ_setup.py && sleep 0' 15627 1726882477.07940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882477.07953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882477.07969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882477.07988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882477.08028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882477.08039: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882477.08051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882477.08070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882477.08084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882477.08089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882477.08097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882477.08106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882477.08117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882477.08124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882477.08130: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882477.08139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882477.08211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882477.08227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882477.08238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882477.08360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882477.61780: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": 
"5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", 
"minute": "34", "second": "37", "epoch": "1726882477", "epoch_int": "1726882477", "date": "2024-09-20", "time": "21:34:37", "iso8601_micro": "2024-09-21T01:34:37.345526Z", "iso8601": "2024-09-21T01:34:37Z", "iso8601_basic": "20240920T213437345526", "iso8601_basic_short": "20240920T213437", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 635, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": 
{"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": 
"off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882477.63479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882477.63483: stdout chunk (state=3): >>><<< 15627 1726882477.63489: stderr chunk (state=3): >>><<< 15627 1726882477.63551: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", 
"minute": "34", "second": "37", "epoch": "1726882477", "epoch_int": "1726882477", "date": "2024-09-20", "time": "21:34:37", "iso8601_micro": "2024-09-21T01:34:37.345526Z", "iso8601": "2024-09-21T01:34:37Z", "iso8601_basic": "20240920T213437345526", "iso8601_basic_short": "20240920T213437", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 635, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": 
{"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": 
"off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882477.64474: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882477.64483: _low_level_execute_command(): starting 15627 1726882477.64488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882476.9438565-16366-279005448150950/ > /dev/null 2>&1 && sleep 0' 15627 1726882477.65567: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882477.65611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882477.65622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882477.65635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882477.65705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882477.65712: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882477.65741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882477.65763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882477.65773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882477.65779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882477.65787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882477.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882477.65806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882477.65815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882477.65842: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882477.65851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882477.65937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882477.66011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882477.66048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882477.66239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882477.67992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882477.68067: stderr 
chunk (state=3): >>><<< 15627 1726882477.68078: stdout chunk (state=3): >>><<< 15627 1726882477.68306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882477.68317: handler run complete 15627 1726882477.68320: variable 'ansible_facts' from source: unknown 15627 1726882477.68381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.68759: variable 'ansible_facts' from source: unknown 15627 1726882477.68857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.69052: attempt loop complete, returning result 15627 1726882477.69062: _execute() done 15627 1726882477.69071: dumping result to json 15627 1726882477.69111: done dumping result, returning 15627 1726882477.69122: done running TaskExecutor() 
for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-00000000024e] 15627 1726882477.69139: sending task result for task 0e448fcc-3ce9-2847-7723-00000000024e ok: [managed_node1] 15627 1726882477.69913: no more pending results, returning what we have 15627 1726882477.69917: results queue empty 15627 1726882477.69918: checking for any_errors_fatal 15627 1726882477.69920: done checking for any_errors_fatal 15627 1726882477.69921: checking for max_fail_percentage 15627 1726882477.69922: done checking for max_fail_percentage 15627 1726882477.69923: checking to see if all hosts have failed and the running result is not ok 15627 1726882477.69924: done checking to see if all hosts have failed 15627 1726882477.69925: getting the remaining hosts for this loop 15627 1726882477.69927: done getting the remaining hosts for this loop 15627 1726882477.69931: getting the next task for host managed_node1 15627 1726882477.69938: done getting next task for host managed_node1 15627 1726882477.69940: ^ task is: TASK: meta (flush_handlers) 15627 1726882477.69942: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882477.69953: getting variables 15627 1726882477.69955: in VariableManager get_vars() 15627 1726882477.69989: Calling all_inventory to load vars for managed_node1 15627 1726882477.69992: Calling groups_inventory to load vars for managed_node1 15627 1726882477.69995: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882477.70014: Calling all_plugins_play to load vars for managed_node1 15627 1726882477.70018: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882477.70022: Calling groups_plugins_play to load vars for managed_node1 15627 1726882477.70877: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000024e 15627 1726882477.70881: WORKER PROCESS EXITING 15627 1726882477.72192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.73889: done with get_vars() 15627 1726882477.73914: done getting variables 15627 1726882477.73987: in VariableManager get_vars() 15627 1726882477.73997: Calling all_inventory to load vars for managed_node1 15627 1726882477.74000: Calling groups_inventory to load vars for managed_node1 15627 1726882477.74002: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882477.74007: Calling all_plugins_play to load vars for managed_node1 15627 1726882477.74010: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882477.74017: Calling groups_plugins_play to load vars for managed_node1 15627 1726882477.75270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.77646: done with get_vars() 15627 1726882477.77681: done queuing things up, now waiting for results queue to drain 15627 1726882477.77683: results queue empty 15627 1726882477.77684: checking for any_errors_fatal 15627 1726882477.77689: done checking for any_errors_fatal 15627 1726882477.77689: checking for max_fail_percentage 15627 
1726882477.77690: done checking for max_fail_percentage 15627 1726882477.77691: checking to see if all hosts have failed and the running result is not ok 15627 1726882477.77692: done checking to see if all hosts have failed 15627 1726882477.77693: getting the remaining hosts for this loop 15627 1726882477.77694: done getting the remaining hosts for this loop 15627 1726882477.77696: getting the next task for host managed_node1 15627 1726882477.77700: done getting next task for host managed_node1 15627 1726882477.77702: ^ task is: TASK: Include the task '{{ task }}' 15627 1726882477.77704: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882477.77706: getting variables 15627 1726882477.77707: in VariableManager get_vars() 15627 1726882477.77715: Calling all_inventory to load vars for managed_node1 15627 1726882477.77717: Calling groups_inventory to load vars for managed_node1 15627 1726882477.77720: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882477.77725: Calling all_plugins_play to load vars for managed_node1 15627 1726882477.77727: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882477.77730: Calling groups_plugins_play to load vars for managed_node1 15627 1726882477.79085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.81454: done with get_vars() 15627 1726882477.81478: done getting variables 15627 1726882477.82762: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:34:37 
-0400 (0:00:00.964) 0:00:17.580 ****** 15627 1726882477.82901: entering _queue_task() for managed_node1/include_tasks 15627 1726882477.83191: worker is 1 (out of 1 available) 15627 1726882477.83203: exiting _queue_task() for managed_node1/include_tasks 15627 1726882477.83214: done queuing things up, now waiting for results queue to drain 15627 1726882477.83215: waiting for pending results... 15627 1726882477.83773: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' 15627 1726882477.83879: in run() - task 0e448fcc-3ce9-2847-7723-000000000031 15627 1726882477.83897: variable 'ansible_search_path' from source: unknown 15627 1726882477.84084: calling self._execute() 15627 1726882477.84171: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882477.84240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882477.84257: variable 'omit' from source: magic vars 15627 1726882477.84624: variable 'ansible_distribution_major_version' from source: facts 15627 1726882477.84640: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882477.84651: variable 'task' from source: play vars 15627 1726882477.84728: variable 'task' from source: play vars 15627 1726882477.84739: _execute() done 15627 1726882477.84745: dumping result to json 15627 1726882477.84752: done dumping result, returning 15627 1726882477.84761: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' [0e448fcc-3ce9-2847-7723-000000000031] 15627 1726882477.84773: sending task result for task 0e448fcc-3ce9-2847-7723-000000000031 15627 1726882477.84888: no more pending results, returning what we have 15627 1726882477.84893: in VariableManager get_vars() 15627 1726882477.84926: Calling all_inventory to load vars for managed_node1 15627 1726882477.84928: Calling groups_inventory to load vars for managed_node1 15627 1726882477.84932: Calling 
all_plugins_inventory to load vars for managed_node1 15627 1726882477.84946: Calling all_plugins_play to load vars for managed_node1 15627 1726882477.84950: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882477.84953: Calling groups_plugins_play to load vars for managed_node1 15627 1726882477.86571: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000031 15627 1726882477.86574: WORKER PROCESS EXITING 15627 1726882477.86871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882477.90186: done with get_vars() 15627 1726882477.90205: variable 'ansible_search_path' from source: unknown 15627 1726882477.90221: we have included files to process 15627 1726882477.90222: generating all_blocks data 15627 1726882477.90223: done generating all_blocks data 15627 1726882477.90224: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15627 1726882477.90225: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15627 1726882477.90228: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15627 1726882477.90633: in VariableManager get_vars() 15627 1726882477.90649: done with get_vars() 15627 1726882477.91200: done processing included file 15627 1726882477.91202: iterating over new_blocks loaded from include file 15627 1726882477.91203: in VariableManager get_vars() 15627 1726882477.91215: done with get_vars() 15627 1726882477.91216: filtering new block on tags 15627 1726882477.91237: done filtering new block on tags 15627 1726882477.91239: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 15627 1726882477.91244: extending task lists for all hosts with included blocks 15627 1726882477.91275: done extending task lists 15627 1726882477.91276: done processing included files 15627 1726882477.91277: results queue empty 15627 1726882477.91278: checking for any_errors_fatal 15627 1726882477.91279: done checking for any_errors_fatal 15627 1726882477.91280: checking for max_fail_percentage 15627 1726882477.91281: done checking for max_fail_percentage 15627 1726882477.91282: checking to see if all hosts have failed and the running result is not ok 15627 1726882477.91283: done checking to see if all hosts have failed 15627 1726882477.91283: getting the remaining hosts for this loop 15627 1726882477.91285: done getting the remaining hosts for this loop 15627 1726882477.91287: getting the next task for host managed_node1 15627 1726882477.91291: done getting next task for host managed_node1 15627 1726882477.91293: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15627 1726882477.91296: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882477.91299: getting variables 15627 1726882477.91300: in VariableManager get_vars() 15627 1726882477.91307: Calling all_inventory to load vars for managed_node1 15627 1726882477.91310: Calling groups_inventory to load vars for managed_node1 15627 1726882477.91312: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882477.91318: Calling all_plugins_play to load vars for managed_node1 15627 1726882477.91320: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882477.91323: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.01624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.04836: done with get_vars() 15627 1726882478.04860: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:34:38 -0400 (0:00:00.220) 0:00:17.800 ****** 15627 1726882478.04934: entering _queue_task() for managed_node1/include_tasks 15627 1726882478.06427: worker is 1 (out of 1 available) 15627 1726882478.06440: exiting _queue_task() for managed_node1/include_tasks 15627 1726882478.06452: done queuing things up, now waiting for results queue to drain 15627 1726882478.06454: waiting for pending results... 
15627 1726882478.07579: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15627 1726882478.07696: in run() - task 0e448fcc-3ce9-2847-7723-00000000025f 15627 1726882478.07716: variable 'ansible_search_path' from source: unknown 15627 1726882478.07723: variable 'ansible_search_path' from source: unknown 15627 1726882478.07764: calling self._execute() 15627 1726882478.07859: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.07873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.07890: variable 'omit' from source: magic vars 15627 1726882478.08268: variable 'ansible_distribution_major_version' from source: facts 15627 1726882478.08288: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882478.08302: _execute() done 15627 1726882478.08312: dumping result to json 15627 1726882478.08323: done dumping result, returning 15627 1726882478.08334: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-2847-7723-00000000025f] 15627 1726882478.08345: sending task result for task 0e448fcc-3ce9-2847-7723-00000000025f 15627 1726882478.08474: no more pending results, returning what we have 15627 1726882478.08479: in VariableManager get_vars() 15627 1726882478.08515: Calling all_inventory to load vars for managed_node1 15627 1726882478.08518: Calling groups_inventory to load vars for managed_node1 15627 1726882478.08522: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882478.08536: Calling all_plugins_play to load vars for managed_node1 15627 1726882478.08539: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882478.08543: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.09683: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000025f 15627 1726882478.09687: WORKER PROCESS EXITING 15627 
1726882478.11201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.14576: done with get_vars() 15627 1726882478.14601: variable 'ansible_search_path' from source: unknown 15627 1726882478.14602: variable 'ansible_search_path' from source: unknown 15627 1726882478.14613: variable 'task' from source: play vars 15627 1726882478.14720: variable 'task' from source: play vars 15627 1726882478.14755: we have included files to process 15627 1726882478.14757: generating all_blocks data 15627 1726882478.14758: done generating all_blocks data 15627 1726882478.14760: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15627 1726882478.14761: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15627 1726882478.14764: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15627 1726882478.15803: done processing included file 15627 1726882478.15805: iterating over new_blocks loaded from include file 15627 1726882478.15807: in VariableManager get_vars() 15627 1726882478.15819: done with get_vars() 15627 1726882478.15821: filtering new block on tags 15627 1726882478.15844: done filtering new block on tags 15627 1726882478.15847: in VariableManager get_vars() 15627 1726882478.15858: done with get_vars() 15627 1726882478.15859: filtering new block on tags 15627 1726882478.15883: done filtering new block on tags 15627 1726882478.15885: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15627 1726882478.15890: extending task lists for all hosts with included blocks 15627 1726882478.16096: done extending 
task lists 15627 1726882478.16098: done processing included files 15627 1726882478.16099: results queue empty 15627 1726882478.16099: checking for any_errors_fatal 15627 1726882478.16103: done checking for any_errors_fatal 15627 1726882478.16103: checking for max_fail_percentage 15627 1726882478.16105: done checking for max_fail_percentage 15627 1726882478.16105: checking to see if all hosts have failed and the running result is not ok 15627 1726882478.16106: done checking to see if all hosts have failed 15627 1726882478.16107: getting the remaining hosts for this loop 15627 1726882478.16108: done getting the remaining hosts for this loop 15627 1726882478.16111: getting the next task for host managed_node1 15627 1726882478.16115: done getting next task for host managed_node1 15627 1726882478.16117: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15627 1726882478.16120: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882478.16122: getting variables 15627 1726882478.16123: in VariableManager get_vars() 15627 1726882478.16130: Calling all_inventory to load vars for managed_node1 15627 1726882478.16132: Calling groups_inventory to load vars for managed_node1 15627 1726882478.16135: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882478.16140: Calling all_plugins_play to load vars for managed_node1 15627 1726882478.16142: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882478.16145: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.17363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.20114: done with get_vars() 15627 1726882478.20141: done getting variables 15627 1726882478.20192: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:34:38 -0400 (0:00:00.152) 0:00:17.953 ****** 15627 1726882478.20224: entering _queue_task() for managed_node1/set_fact 15627 1726882478.20573: worker is 1 (out of 1 available) 15627 1726882478.20589: exiting _queue_task() for managed_node1/set_fact 15627 1726882478.20603: done queuing things up, now waiting for results queue to drain 15627 1726882478.20604: waiting for pending results... 
15627 1726882478.20894: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15627 1726882478.21031: in run() - task 0e448fcc-3ce9-2847-7723-00000000026c 15627 1726882478.21056: variable 'ansible_search_path' from source: unknown 15627 1726882478.21068: variable 'ansible_search_path' from source: unknown 15627 1726882478.21107: calling self._execute() 15627 1726882478.21212: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.21225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.21245: variable 'omit' from source: magic vars 15627 1726882478.21633: variable 'ansible_distribution_major_version' from source: facts 15627 1726882478.21652: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882478.21667: variable 'omit' from source: magic vars 15627 1726882478.21725: variable 'omit' from source: magic vars 15627 1726882478.21766: variable 'omit' from source: magic vars 15627 1726882478.21818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882478.21858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882478.21888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882478.21920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.21941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.21977: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882478.21987: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.21995: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15627 1726882478.22102: Set connection var ansible_timeout to 10 15627 1726882478.22124: Set connection var ansible_shell_executable to /bin/sh 15627 1726882478.22137: Set connection var ansible_connection to ssh 15627 1726882478.22147: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882478.22157: Set connection var ansible_pipelining to False 15627 1726882478.22167: Set connection var ansible_shell_type to sh 15627 1726882478.22195: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.22203: variable 'ansible_connection' from source: unknown 15627 1726882478.22210: variable 'ansible_module_compression' from source: unknown 15627 1726882478.22220: variable 'ansible_shell_type' from source: unknown 15627 1726882478.22229: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.22240: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.22248: variable 'ansible_pipelining' from source: unknown 15627 1726882478.22255: variable 'ansible_timeout' from source: unknown 15627 1726882478.22265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.23082: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882478.23242: variable 'omit' from source: magic vars 15627 1726882478.23251: starting attempt loop 15627 1726882478.23257: running the handler 15627 1726882478.23277: handler run complete 15627 1726882478.23291: attempt loop complete, returning result 15627 1726882478.23297: _execute() done 15627 1726882478.23302: dumping result to json 15627 1726882478.23308: done dumping result, returning 15627 1726882478.23318: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-2847-7723-00000000026c] 15627 1726882478.23327: sending task result for task 0e448fcc-3ce9-2847-7723-00000000026c ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15627 1726882478.23495: no more pending results, returning what we have 15627 1726882478.23497: results queue empty 15627 1726882478.23498: checking for any_errors_fatal 15627 1726882478.23501: done checking for any_errors_fatal 15627 1726882478.23501: checking for max_fail_percentage 15627 1726882478.23503: done checking for max_fail_percentage 15627 1726882478.23504: checking to see if all hosts have failed and the running result is not ok 15627 1726882478.23506: done checking to see if all hosts have failed 15627 1726882478.23506: getting the remaining hosts for this loop 15627 1726882478.23508: done getting the remaining hosts for this loop 15627 1726882478.23512: getting the next task for host managed_node1 15627 1726882478.23521: done getting next task for host managed_node1 15627 1726882478.23524: ^ task is: TASK: Stat profile file 15627 1726882478.23529: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882478.23533: getting variables 15627 1726882478.23535: in VariableManager get_vars() 15627 1726882478.23562: Calling all_inventory to load vars for managed_node1 15627 1726882478.23567: Calling groups_inventory to load vars for managed_node1 15627 1726882478.23570: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882478.23581: Calling all_plugins_play to load vars for managed_node1 15627 1726882478.23584: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882478.23587: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.24770: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000026c 15627 1726882478.24773: WORKER PROCESS EXITING 15627 1726882478.26154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.29284: done with get_vars() 15627 1726882478.29311: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:34:38 -0400 (0:00:00.091) 0:00:18.045 ****** 15627 1726882478.29406: entering _queue_task() for managed_node1/stat 15627 1726882478.30073: worker is 1 (out of 1 available) 15627 1726882478.30086: exiting _queue_task() for managed_node1/stat 15627 1726882478.30097: done queuing things up, now waiting for results queue to drain 15627 1726882478.30099: waiting for pending results... 
15627 1726882478.30383: running TaskExecutor() for managed_node1/TASK: Stat profile file 15627 1726882478.30514: in run() - task 0e448fcc-3ce9-2847-7723-00000000026d 15627 1726882478.30536: variable 'ansible_search_path' from source: unknown 15627 1726882478.30546: variable 'ansible_search_path' from source: unknown 15627 1726882478.30591: calling self._execute() 15627 1726882478.30693: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.30704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.30720: variable 'omit' from source: magic vars 15627 1726882478.31120: variable 'ansible_distribution_major_version' from source: facts 15627 1726882478.31142: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882478.31154: variable 'omit' from source: magic vars 15627 1726882478.31208: variable 'omit' from source: magic vars 15627 1726882478.31315: variable 'profile' from source: play vars 15627 1726882478.31325: variable 'interface' from source: set_fact 15627 1726882478.31398: variable 'interface' from source: set_fact 15627 1726882478.31426: variable 'omit' from source: magic vars 15627 1726882478.31475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882478.31515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882478.31545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882478.31576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.31596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.31634: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 
1726882478.31643: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.31650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.31759: Set connection var ansible_timeout to 10 15627 1726882478.31777: Set connection var ansible_shell_executable to /bin/sh 15627 1726882478.31791: Set connection var ansible_connection to ssh 15627 1726882478.31801: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882478.31812: Set connection var ansible_pipelining to False 15627 1726882478.31819: Set connection var ansible_shell_type to sh 15627 1726882478.31850: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.31859: variable 'ansible_connection' from source: unknown 15627 1726882478.31868: variable 'ansible_module_compression' from source: unknown 15627 1726882478.31876: variable 'ansible_shell_type' from source: unknown 15627 1726882478.31882: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.31892: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.31900: variable 'ansible_pipelining' from source: unknown 15627 1726882478.31907: variable 'ansible_timeout' from source: unknown 15627 1726882478.31914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.32133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882478.32149: variable 'omit' from source: magic vars 15627 1726882478.32158: starting attempt loop 15627 1726882478.32166: running the handler 15627 1726882478.32186: _low_level_execute_command(): starting 15627 1726882478.32198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882478.32999: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.33016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.33035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.33056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.33108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.33119: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.33133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.33154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882478.33170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.33183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882478.33200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.33214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.33231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.33244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.33258: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882478.33277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.33359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.33379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.33393: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15627 1726882478.33538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.35203: stdout chunk (state=3): >>>/root <<< 15627 1726882478.35378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.35381: stdout chunk (state=3): >>><<< 15627 1726882478.35384: stderr chunk (state=3): >>><<< 15627 1726882478.35489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.35494: _low_level_execute_command(): starting 15627 1726882478.35498: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374 `" && echo ansible-tmp-1726882478.3540373-16421-204213620914374="` 
echo /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374 `" ) && sleep 0' 15627 1726882478.37082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.37097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.37113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.37155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.37209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.37221: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.37234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.37257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882478.37272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.37284: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882478.37297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.37310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.37326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.37348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.37365: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882478.37383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.37456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 
1726882478.37479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.37494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.37856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.39511: stdout chunk (state=3): >>>ansible-tmp-1726882478.3540373-16421-204213620914374=/root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374 <<< 15627 1726882478.39690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.39693: stdout chunk (state=3): >>><<< 15627 1726882478.39696: stderr chunk (state=3): >>><<< 15627 1726882478.39972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882478.3540373-16421-204213620914374=/root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 15627 1726882478.39977: variable 'ansible_module_compression' from source: unknown 15627 1726882478.39979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15627 1726882478.39981: variable 'ansible_facts' from source: unknown 15627 1726882478.39983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/AnsiballZ_stat.py 15627 1726882478.40489: Sending initial data 15627 1726882478.40492: Sent initial data (153 bytes) 15627 1726882478.41438: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.41456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.41476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.41496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.41535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.41553: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.41574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.41593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882478.41605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.41617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882478.41630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.41645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.41671: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.41685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.41697: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882478.41712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.41793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.41814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.41829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.41959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.43689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882478.43783: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882478.43887: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmphrr7_2al /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/AnsiballZ_stat.py <<< 15627 1726882478.43981: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882478.45621: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 15627 1726882478.45744: stderr chunk (state=3): >>><<< 15627 1726882478.45748: stdout chunk (state=3): >>><<< 15627 1726882478.45750: done transferring module to remote 15627 1726882478.45752: _low_level_execute_command(): starting 15627 1726882478.45762: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/ /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/AnsiballZ_stat.py && sleep 0' 15627 1726882478.47105: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.47109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.47270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882478.47273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.47276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882478.47278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.47333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.47450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 
1726882478.47669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.49313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.49384: stderr chunk (state=3): >>><<< 15627 1726882478.49387: stdout chunk (state=3): >>><<< 15627 1726882478.49484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.49488: _low_level_execute_command(): starting 15627 1726882478.49490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/AnsiballZ_stat.py && sleep 0' 15627 1726882478.50860: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.50865: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.50909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882478.50912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.50914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882478.50917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.51165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.51183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.51347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.64268: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15627 1726882478.65852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882478.65859: stdout chunk (state=3): >>><<< 15627 1726882478.65862: stderr chunk (state=3): >>><<< 15627 1726882478.65978: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
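The module result above (`{"changed": false, "stat": {"exists": false}}`) is what Ansible's `stat` module reports for a path that does not exist on the managed host. The shape of that check can be sketched as follows; `stat_result` is a hypothetical helper for illustration, not the real module's code, which reports many more fields:

```python
import json
import os


def stat_result(path, follow=False):
    """Mimic the result shape of Ansible's stat module for a path.

    Hypothetical sketch: the real module also returns mode, owner,
    checksums, etc. when the path exists.
    """
    # follow=False means do not dereference symlinks, like the module default.
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    if not exists:
        return {"changed": False, "stat": {"exists": False}}
    st = os.stat(path) if follow else os.lstat(path)
    return {"changed": False, "stat": {"exists": True, "size": st.st_size}}


# The path checked in the log above; on most systems it will not exist.
print(json.dumps(stat_result("/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31")))
```

The controller then parses this JSON from the module's stdout, which is why the log shows the raw dict echoed back in `_low_level_execute_command() done:`.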
15627 1726882478.65982: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882478.65985: _low_level_execute_command(): starting 15627 1726882478.65990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882478.3540373-16421-204213620914374/ > /dev/null 2>&1 && sleep 0' 15627 1726882478.66647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.66670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.66690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.66718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.66783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.66795: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.66807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.66823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
15627 1726882478.66839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.66850: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882478.66873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.66887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.66901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.66918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.66928: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882478.66945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.67038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.67041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.67147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.69098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.69101: stdout chunk (state=3): >>><<< 15627 1726882478.69104: stderr chunk (state=3): >>><<< 15627 1726882478.69269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.69272: handler run complete 15627 1726882478.69275: attempt loop complete, returning result 15627 1726882478.69277: _execute() done 15627 1726882478.69279: dumping result to json 15627 1726882478.69280: done dumping result, returning 15627 1726882478.69282: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-2847-7723-00000000026d] 15627 1726882478.69284: sending task result for task 0e448fcc-3ce9-2847-7723-00000000026d 15627 1726882478.69356: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000026d 15627 1726882478.69359: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15627 1726882478.69427: no more pending results, returning what we have 15627 1726882478.69431: results queue empty 15627 1726882478.69432: checking for any_errors_fatal 15627 1726882478.69437: done checking for any_errors_fatal 15627 1726882478.69438: checking for max_fail_percentage 15627 1726882478.69443: done checking for max_fail_percentage 15627 1726882478.69444: checking to see if all hosts have failed and the running result is not ok 15627 1726882478.69445: done checking to see if all hosts have failed 15627 1726882478.69446: getting the remaining hosts for this loop 15627 1726882478.69448: done getting the remaining hosts 
for this loop 15627 1726882478.69452: getting the next task for host managed_node1 15627 1726882478.69462: done getting next task for host managed_node1 15627 1726882478.69466: ^ task is: TASK: Set NM profile exist flag based on the profile files 15627 1726882478.69470: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882478.69480: getting variables 15627 1726882478.69482: in VariableManager get_vars() 15627 1726882478.69511: Calling all_inventory to load vars for managed_node1 15627 1726882478.69514: Calling groups_inventory to load vars for managed_node1 15627 1726882478.69517: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882478.69529: Calling all_plugins_play to load vars for managed_node1 15627 1726882478.69532: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882478.69534: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.71956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.72935: done with get_vars() 15627 1726882478.72951: done getting variables 15627 1726882478.72998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:34:38 -0400 (0:00:00.436) 0:00:18.481 ****** 15627 1726882478.73020: entering _queue_task() for managed_node1/set_fact 15627 1726882478.73240: worker is 1 (out of 1 available) 15627 1726882478.73251: exiting _queue_task() for managed_node1/set_fact 15627 1726882478.73266: done queuing things up, now waiting for results queue to drain 15627 1726882478.73267: waiting for pending results... 
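The task queued here is subsequently skipped because its `when:` conditional (`profile_stat.stat.exists`) evaluates to `False` against the fact registered by the previous `stat` task. The skip decision can be sketched roughly as below; `evaluate_when` is a simplified stand-in, since real ansible-core templates the expression through Jinja2 rather than Python `eval`:

```python
def evaluate_when(conditional, variables):
    """Roughly how a when: clause decides skip vs. run: the expression is
    evaluated against the task's variables, and a falsy result skips the
    task with a structured result. Simplified sketch, not ansible-core's
    actual conditional plugin.
    """
    result = bool(eval(conditional, {}, variables))  # illustration only
    if not result:
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return None  # task would run


# Mirrors the registered fact from the stat task in the log above.
profile_stat = {"stat": {"exists": False}}
outcome = evaluate_when("profile_stat['stat']['exists']",
                        {"profile_stat": profile_stat})
print(outcome["skip_reason"])
```

This matches the `skipping: [managed_node1]` result shown in the log, including the `false_condition` and `skip_reason` keys.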
15627 1726882478.73433: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15627 1726882478.73514: in run() - task 0e448fcc-3ce9-2847-7723-00000000026e 15627 1726882478.73525: variable 'ansible_search_path' from source: unknown 15627 1726882478.73528: variable 'ansible_search_path' from source: unknown 15627 1726882478.73562: calling self._execute() 15627 1726882478.73681: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.73699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.73722: variable 'omit' from source: magic vars 15627 1726882478.74828: variable 'ansible_distribution_major_version' from source: facts 15627 1726882478.74849: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882478.75115: variable 'profile_stat' from source: set_fact 15627 1726882478.75308: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882478.75318: when evaluation is False, skipping this task 15627 1726882478.75326: _execute() done 15627 1726882478.75337: dumping result to json 15627 1726882478.75346: done dumping result, returning 15627 1726882478.75355: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-2847-7723-00000000026e] 15627 1726882478.75376: sending task result for task 0e448fcc-3ce9-2847-7723-00000000026e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882478.75617: no more pending results, returning what we have 15627 1726882478.75623: results queue empty 15627 1726882478.75624: checking for any_errors_fatal 15627 1726882478.75658: done checking for any_errors_fatal 15627 1726882478.75660: checking for max_fail_percentage 15627 1726882478.75663: done checking for max_fail_percentage 15627 1726882478.75664: checking to see if all 
hosts have failed and the running result is not ok 15627 1726882478.75718: done checking to see if all hosts have failed 15627 1726882478.75719: getting the remaining hosts for this loop 15627 1726882478.75722: done getting the remaining hosts for this loop 15627 1726882478.75726: getting the next task for host managed_node1 15627 1726882478.75734: done getting next task for host managed_node1 15627 1726882478.75737: ^ task is: TASK: Get NM profile info 15627 1726882478.75743: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882478.75749: getting variables 15627 1726882478.75808: in VariableManager get_vars() 15627 1726882478.75841: Calling all_inventory to load vars for managed_node1 15627 1726882478.75846: Calling groups_inventory to load vars for managed_node1 15627 1726882478.75852: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882478.75948: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000026e 15627 1726882478.75953: WORKER PROCESS EXITING 15627 1726882478.75997: Calling all_plugins_play to load vars for managed_node1 15627 1726882478.76001: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882478.76005: Calling groups_plugins_play to load vars for managed_node1 15627 1726882478.77573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882478.79183: done with get_vars() 15627 1726882478.79199: done getting variables 15627 1726882478.79367: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:34:38 -0400 (0:00:00.063) 0:00:18.545 ****** 15627 1726882478.79434: entering _queue_task() for managed_node1/shell 15627 1726882478.79436: Creating lock for shell 15627 1726882478.79832: worker is 1 (out of 1 available) 15627 1726882478.79859: exiting _queue_task() for managed_node1/shell 15627 1726882478.79900: done queuing things up, now waiting for results queue to drain 15627 1726882478.79902: waiting for pending results... 
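Before the queued shell task can run its command, the connection plugin creates a per-task temporary directory on the remote host, which is where the `ansible-tmp-1726882478.8585105-16443-87395876269451` style paths in the surrounding log come from. A hedged sketch of how such a name could be composed (hypothetical helper, not ansible-core's actual implementation):

```python
import os
import random
import time


def make_remote_tmp_name(remote_tmp="/root/.ansible/tmp"):
    """Build a per-task temp directory path in the same style as the
    ansible-tmp-<timestamp>-<pid>-<random> names seen in the log.

    Hypothetical sketch: the exact components ansible-core uses are an
    assumption here; only the observable naming pattern is taken from
    the log.
    """
    suffix = "ansible-tmp-%s-%s-%s" % (
        time.time(), os.getpid(), random.randint(0, 2 ** 48))
    return "%s/%s" % (remote_tmp, suffix)


path = make_remote_tmp_name()
print(path)
```

The directory itself is then created under `umask 77` via `/bin/sh -c '( umask 77 && mkdir -p ... )'`, as the executed command in the log shows, so only the connecting user can read the transferred `AnsiballZ_*.py` payload.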
15627 1726882478.80378: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15627 1726882478.80618: in run() - task 0e448fcc-3ce9-2847-7723-00000000026f 15627 1726882478.80646: variable 'ansible_search_path' from source: unknown 15627 1726882478.80666: variable 'ansible_search_path' from source: unknown 15627 1726882478.80734: calling self._execute() 15627 1726882478.80924: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.80946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.80986: variable 'omit' from source: magic vars 15627 1726882478.81942: variable 'ansible_distribution_major_version' from source: facts 15627 1726882478.81982: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882478.81995: variable 'omit' from source: magic vars 15627 1726882478.82032: variable 'omit' from source: magic vars 15627 1726882478.82126: variable 'profile' from source: play vars 15627 1726882478.82137: variable 'interface' from source: set_fact 15627 1726882478.82189: variable 'interface' from source: set_fact 15627 1726882478.82227: variable 'omit' from source: magic vars 15627 1726882478.82314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882478.82391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882478.82420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882478.82461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.82503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882478.82533: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 
1726882478.82537: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.82540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.82702: Set connection var ansible_timeout to 10 15627 1726882478.82709: Set connection var ansible_shell_executable to /bin/sh 15627 1726882478.82714: Set connection var ansible_connection to ssh 15627 1726882478.82720: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882478.82727: Set connection var ansible_pipelining to False 15627 1726882478.82739: Set connection var ansible_shell_type to sh 15627 1726882478.82767: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.82772: variable 'ansible_connection' from source: unknown 15627 1726882478.82775: variable 'ansible_module_compression' from source: unknown 15627 1726882478.82777: variable 'ansible_shell_type' from source: unknown 15627 1726882478.82780: variable 'ansible_shell_executable' from source: unknown 15627 1726882478.82782: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882478.82784: variable 'ansible_pipelining' from source: unknown 15627 1726882478.82787: variable 'ansible_timeout' from source: unknown 15627 1726882478.82789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882478.82940: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882478.82948: variable 'omit' from source: magic vars 15627 1726882478.82952: starting attempt loop 15627 1726882478.82955: running the handler 15627 1726882478.82980: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882478.82998: _low_level_execute_command(): starting 15627 1726882478.83005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882478.83546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.83579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.83593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.83648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.83657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.83773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.85424: stdout chunk (state=3): >>>/root <<< 15627 1726882478.85551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.85789: stderr chunk (state=3): >>><<< 15627 1726882478.85807: 
stdout chunk (state=3): >>><<< 15627 1726882478.85845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.85865: _low_level_execute_command(): starting 15627 1726882478.85873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451 `" && echo ansible-tmp-1726882478.8585105-16443-87395876269451="` echo /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451 `" ) && sleep 0' 15627 1726882478.86889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.86913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.87021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.87041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882478.87081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882478.87142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.87249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.87253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.87265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.87382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.89263: stdout chunk (state=3): >>>ansible-tmp-1726882478.8585105-16443-87395876269451=/root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451 <<< 15627 1726882478.89446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.89449: stdout chunk (state=3): >>><<< 15627 1726882478.89476: stderr chunk (state=3): >>><<< 15627 1726882478.89494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882478.8585105-16443-87395876269451=/root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.89539: variable 'ansible_module_compression' from source: unknown 15627 1726882478.89613: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15627 1726882478.89662: variable 'ansible_facts' from source: unknown 15627 1726882478.89772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/AnsiballZ_command.py 15627 1726882478.89992: Sending initial data 15627 1726882478.90000: Sent initial data (155 bytes) 15627 1726882478.91489: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.91492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.91534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.91539: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.91617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.91630: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.91640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.91658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882478.91665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.91707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882478.91770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.91814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.91833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.91836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.91842: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882478.91856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.91937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.91951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.91964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.92098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.93826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882478.93919: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882478.94020: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp4n4fg9v7 /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/AnsiballZ_command.py <<< 15627 1726882478.94111: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882478.95244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.95369: stderr chunk (state=3): >>><<< 15627 1726882478.95372: stdout chunk (state=3): >>><<< 15627 1726882478.95380: done transferring module to remote 15627 1726882478.95405: _low_level_execute_command(): starting 15627 1726882478.95409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/ /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/AnsiballZ_command.py && sleep 0' 15627 1726882478.96006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.96020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.96080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.96087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.96089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882478.96091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.96131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.96134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.96233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882478.98080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882478.98258: stderr chunk (state=3): >>><<< 15627 1726882478.98262: stdout chunk (state=3): >>><<< 15627 1726882478.98294: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882478.98297: _low_level_execute_command(): starting 15627 1726882478.98300: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/AnsiballZ_command.py && sleep 0' 15627 1726882478.98947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882478.98961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882478.98979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882478.99004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882478.99052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882478.99068: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882478.99090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882478.99094: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882478.99096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882478.99153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882478.99156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882478.99158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882478.99274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882479.13988: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:39.120322", "end": "2024-09-20 21:34:39.138391", "delta": "0:00:00.018069", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882479.15193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882479.15197: stdout chunk (state=3): >>><<< 15627 1726882479.15199: stderr chunk (state=3): >>><<< 15627 1726882479.15342: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:39.120322", "end": "2024-09-20 21:34:39.138391", "delta": "0:00:00.018069", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.44.90 closed. 15627 1726882479.15346: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882479.15352: _low_level_execute_command(): starting 15627 1726882479.15356: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882478.8585105-16443-87395876269451/ > /dev/null 2>&1 && sleep 0' 15627 1726882479.16740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882479.16744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882479.17406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882479.17411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15627 1726882479.17413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882479.17470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882479.17485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882479.17595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882479.19413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882479.19488: stderr chunk (state=3): >>><<< 15627 1726882479.19492: stdout chunk (state=3): >>><<< 15627 1726882479.19777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882479.19781: handler run complete 15627 1726882479.19783: Evaluated conditional (False): False 15627 1726882479.19785: attempt loop complete, returning result 15627 1726882479.19787: _execute() done 15627 1726882479.19790: dumping result to json 15627 1726882479.19791: done dumping result, returning 15627 1726882479.19793: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-2847-7723-00000000026f] 15627 1726882479.19795: sending task result for task 0e448fcc-3ce9-2847-7723-00000000026f 15627 1726882479.19872: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000026f 15627 1726882479.19876: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.018069", "end": "2024-09-20 21:34:39.138391", "rc": 0, "start": "2024-09-20 21:34:39.120322" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15627 1726882479.19953: no more pending results, returning what we have 15627 1726882479.19958: results queue empty 15627 1726882479.19959: checking for any_errors_fatal 15627 1726882479.19966: done checking for any_errors_fatal 15627 1726882479.19967: checking for max_fail_percentage 15627 1726882479.19969: done checking for max_fail_percentage 15627 1726882479.19970: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.19971: done checking to see if all hosts have failed 15627 1726882479.19972: getting the remaining hosts for this loop 15627 1726882479.19974: done getting the remaining hosts for this loop 15627 1726882479.19978: getting the next task for host managed_node1 15627 1726882479.19987: done getting next task for host managed_node1 15627 1726882479.19990: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on 
the nmcli output 15627 1726882479.19994: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882479.19998: getting variables 15627 1726882479.19999: in VariableManager get_vars() 15627 1726882479.20028: Calling all_inventory to load vars for managed_node1 15627 1726882479.20030: Calling groups_inventory to load vars for managed_node1 15627 1726882479.20034: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882479.20046: Calling all_plugins_play to load vars for managed_node1 15627 1726882479.20049: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882479.20052: Calling groups_plugins_play to load vars for managed_node1 15627 1726882479.23714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882479.26821: done with get_vars() 15627 1726882479.26845: done getting variables 15627 1726882479.27611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:34:39 -0400 (0:00:00.482) 0:00:19.028 ****** 15627 1726882479.27644: entering _queue_task() for managed_node1/set_fact 15627 1726882479.27965: worker is 1 (out of 1 available) 15627 1726882479.27977: exiting _queue_task() for managed_node1/set_fact 15627 1726882479.27990: done queuing things up, now waiting for results queue to drain 15627 1726882479.27991: waiting for pending results... 15627 1726882479.28944: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15627 1726882479.29235: in run() - task 0e448fcc-3ce9-2847-7723-000000000270 15627 1726882479.29313: variable 'ansible_search_path' from source: unknown 15627 1726882479.30121: variable 'ansible_search_path' from source: unknown 15627 1726882479.30167: calling self._execute() 15627 1726882479.30313: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.30560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.30674: variable 'omit' from source: magic vars 15627 1726882479.31399: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.31484: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.31736: variable 'nm_profile_exists' from source: set_fact 15627 1726882479.31874: Evaluated conditional (nm_profile_exists.rc == 0): True 15627 1726882479.31886: variable 'omit' from source: magic vars 15627 1726882479.31933: variable 'omit' from source: magic vars 15627 1726882479.31981: 
variable 'omit' from source: magic vars 15627 1726882479.32111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882479.32150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882479.32275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882479.32304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.32322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.32359: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882479.32404: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.32414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.32567: Set connection var ansible_timeout to 10 15627 1726882479.32683: Set connection var ansible_shell_executable to /bin/sh 15627 1726882479.32695: Set connection var ansible_connection to ssh 15627 1726882479.32706: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882479.32718: Set connection var ansible_pipelining to False 15627 1726882479.32729: Set connection var ansible_shell_type to sh 15627 1726882479.32862: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.32875: variable 'ansible_connection' from source: unknown 15627 1726882479.32884: variable 'ansible_module_compression' from source: unknown 15627 1726882479.32892: variable 'ansible_shell_type' from source: unknown 15627 1726882479.32899: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.32906: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.32915: variable 'ansible_pipelining' from 
source: unknown 15627 1726882479.32921: variable 'ansible_timeout' from source: unknown 15627 1726882479.32929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.33189: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882479.33204: variable 'omit' from source: magic vars 15627 1726882479.33280: starting attempt loop 15627 1726882479.33287: running the handler 15627 1726882479.33303: handler run complete 15627 1726882479.33316: attempt loop complete, returning result 15627 1726882479.33322: _execute() done 15627 1726882479.33327: dumping result to json 15627 1726882479.33333: done dumping result, returning 15627 1726882479.33344: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-2847-7723-000000000270] 15627 1726882479.33353: sending task result for task 0e448fcc-3ce9-2847-7723-000000000270 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15627 1726882479.33544: no more pending results, returning what we have 15627 1726882479.33547: results queue empty 15627 1726882479.33548: checking for any_errors_fatal 15627 1726882479.33555: done checking for any_errors_fatal 15627 1726882479.33556: checking for max_fail_percentage 15627 1726882479.33558: done checking for max_fail_percentage 15627 1726882479.33559: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.33560: done checking to see if all hosts have failed 15627 1726882479.33561: getting the remaining hosts for this loop 15627 1726882479.33562: done 
getting the remaining hosts for this loop 15627 1726882479.33568: getting the next task for host managed_node1 15627 1726882479.33579: done getting next task for host managed_node1 15627 1726882479.33581: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15627 1726882479.33586: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15627 1726882479.33590: getting variables
15627 1726882479.33591: in VariableManager get_vars()
15627 1726882479.33621: Calling all_inventory to load vars for managed_node1
15627 1726882479.33623: Calling groups_inventory to load vars for managed_node1
15627 1726882479.33627: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.33638: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.33641: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.33643: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.34987: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000270
15627 1726882479.34991: WORKER PROCESS EXITING
15627 1726882479.35501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.39430: done with get_vars()
15627 1726882479.39454: done getting variables
15627 1726882479.39514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.39629: variable 'profile' from source: play vars
15627 1726882479.39633: variable 'interface' from source: set_fact
15627 1726882479.39696: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] *******************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 21:34:39 -0400 (0:00:00.120) 0:00:19.148 ******
15627 1726882479.39731: entering _queue_task() for managed_node1/command
15627 1726882479.40721: worker is 1 (out of 1 available)
15627 1726882479.40731: exiting _queue_task() for managed_node1/command 15627
1726882479.40743: done queuing things up, now waiting for results queue to drain 15627 1726882479.40744: waiting for pending results... 15627 1726882479.41531: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15627 1726882479.41891: in run() - task 0e448fcc-3ce9-2847-7723-000000000272 15627 1726882479.41913: variable 'ansible_search_path' from source: unknown 15627 1726882479.41917: variable 'ansible_search_path' from source: unknown 15627 1726882479.41948: calling self._execute() 15627 1726882479.42150: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.42167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.42182: variable 'omit' from source: magic vars 15627 1726882479.42816: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.42833: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.42963: variable 'profile_stat' from source: set_fact 15627 1726882479.42989: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882479.42997: when evaluation is False, skipping this task 15627 1726882479.43004: _execute() done 15627 1726882479.43011: dumping result to json 15627 1726882479.43018: done dumping result, returning 15627 1726882479.43028: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000272] 15627 1726882479.43038: sending task result for task 0e448fcc-3ce9-2847-7723-000000000272 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882479.43181: no more pending results, returning what we have 15627 1726882479.43185: results queue empty 15627 1726882479.43186: checking for any_errors_fatal 15627 1726882479.43191: done checking for any_errors_fatal 15627 1726882479.43192: 
checking for max_fail_percentage 15627 1726882479.43193: done checking for max_fail_percentage 15627 1726882479.43194: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.43195: done checking to see if all hosts have failed 15627 1726882479.43196: getting the remaining hosts for this loop 15627 1726882479.43197: done getting the remaining hosts for this loop 15627 1726882479.43201: getting the next task for host managed_node1 15627 1726882479.43209: done getting next task for host managed_node1 15627 1726882479.43211: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15627 1726882479.43215: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15627 1726882479.43218: getting variables
15627 1726882479.43220: in VariableManager get_vars()
15627 1726882479.43247: Calling all_inventory to load vars for managed_node1
15627 1726882479.43249: Calling groups_inventory to load vars for managed_node1
15627 1726882479.43253: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.43269: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.43272: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.43275: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.43798: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000272
15627 1726882479.43802: WORKER PROCESS EXITING
15627 1726882479.44909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.46746: done with get_vars()
15627 1726882479.46771: done getting variables
15627 1726882479.46831: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.46943: variable 'profile' from source: play vars
15627 1726882479.46947: variable 'interface' from source: set_fact
15627 1726882479.47009: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 21:34:39 -0400 (0:00:00.073) 0:00:19.222 ******
15627 1726882479.47042: entering _queue_task() for managed_node1/set_fact
15627 1726882479.47316: worker is 1 (out of 1 available)
15627 1726882479.47327: exiting _queue_task() for managed_node1/set_fact 15627
1726882479.47339: done queuing things up, now waiting for results queue to drain 15627 1726882479.47340: waiting for pending results... 15627 1726882479.48197: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15627 1726882479.48408: in run() - task 0e448fcc-3ce9-2847-7723-000000000273 15627 1726882479.48421: variable 'ansible_search_path' from source: unknown 15627 1726882479.48424: variable 'ansible_search_path' from source: unknown 15627 1726882479.48480: calling self._execute() 15627 1726882479.48574: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.48593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.48611: variable 'omit' from source: magic vars 15627 1726882479.48975: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.48994: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.49134: variable 'profile_stat' from source: set_fact 15627 1726882479.49157: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882479.49171: when evaluation is False, skipping this task 15627 1726882479.49181: _execute() done 15627 1726882479.49190: dumping result to json 15627 1726882479.49200: done dumping result, returning 15627 1726882479.49214: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000273] 15627 1726882479.49234: sending task result for task 0e448fcc-3ce9-2847-7723-000000000273 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882479.49380: no more pending results, returning what we have 15627 1726882479.49384: results queue empty 15627 1726882479.49386: checking for any_errors_fatal 15627 1726882479.49392: done checking for any_errors_fatal 15627 
1726882479.49393: checking for max_fail_percentage 15627 1726882479.49395: done checking for max_fail_percentage 15627 1726882479.49396: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.49397: done checking to see if all hosts have failed 15627 1726882479.49398: getting the remaining hosts for this loop 15627 1726882479.49399: done getting the remaining hosts for this loop 15627 1726882479.49403: getting the next task for host managed_node1 15627 1726882479.49412: done getting next task for host managed_node1 15627 1726882479.49415: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15627 1726882479.49419: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15627 1726882479.49424: getting variables
15627 1726882479.49425: in VariableManager get_vars()
15627 1726882479.49453: Calling all_inventory to load vars for managed_node1
15627 1726882479.49458: Calling groups_inventory to load vars for managed_node1
15627 1726882479.49462: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.49478: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.49481: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.49484: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.50684: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000273
15627 1726882479.50687: WORKER PROCESS EXITING
15627 1726882479.51421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.53256: done with get_vars()
15627 1726882479.53279: done getting variables
15627 1726882479.53336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.53442: variable 'profile' from source: play vars
15627 1726882479.53446: variable 'interface' from source: set_fact
15627 1726882479.53511: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] ***********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:34:39 -0400 (0:00:00.065) 0:00:19.287 ******
15627 1726882479.53546: entering _queue_task() for managed_node1/command
15627 1726882479.53827: worker is 1 (out of 1 available)
15627 1726882479.53842: exiting _queue_task() for managed_node1/command 15627
1726882479.53856: done queuing things up, now waiting for results queue to drain 15627 1726882479.53860: waiting for pending results... 15627 1726882479.54335: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15627 1726882479.54456: in run() - task 0e448fcc-3ce9-2847-7723-000000000274 15627 1726882479.54485: variable 'ansible_search_path' from source: unknown 15627 1726882479.54492: variable 'ansible_search_path' from source: unknown 15627 1726882479.54551: calling self._execute() 15627 1726882479.54755: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.54778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.54803: variable 'omit' from source: magic vars 15627 1726882479.55386: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.55403: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.55625: variable 'profile_stat' from source: set_fact 15627 1726882479.55652: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882479.55655: when evaluation is False, skipping this task 15627 1726882479.55658: _execute() done 15627 1726882479.55665: dumping result to json 15627 1726882479.55668: done dumping result, returning 15627 1726882479.55673: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000274] 15627 1726882479.55685: sending task result for task 0e448fcc-3ce9-2847-7723-000000000274 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882479.55832: no more pending results, returning what we have 15627 1726882479.55836: results queue empty 15627 1726882479.55837: checking for any_errors_fatal 15627 1726882479.55843: done checking for any_errors_fatal 15627 1726882479.55843: checking 
for max_fail_percentage 15627 1726882479.55845: done checking for max_fail_percentage 15627 1726882479.55846: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.55847: done checking to see if all hosts have failed 15627 1726882479.55847: getting the remaining hosts for this loop 15627 1726882479.55849: done getting the remaining hosts for this loop 15627 1726882479.55852: getting the next task for host managed_node1 15627 1726882479.55859: done getting next task for host managed_node1 15627 1726882479.55861: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15627 1726882479.55868: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15627 1726882479.55871: getting variables
15627 1726882479.55872: in VariableManager get_vars()
15627 1726882479.55896: Calling all_inventory to load vars for managed_node1
15627 1726882479.55898: Calling groups_inventory to load vars for managed_node1
15627 1726882479.55901: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.55913: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.55916: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.55919: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.56468: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000274
15627 1726882479.56472: WORKER PROCESS EXITING
15627 1726882479.56705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.58609: done with get_vars()
15627 1726882479.58628: done getting variables
15627 1726882479.58687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.58848: variable 'profile' from source: play vars
15627 1726882479.58852: variable 'interface' from source: set_fact
15627 1726882479.58935: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:34:39 -0400 (0:00:00.054) 0:00:19.341 ******
15627 1726882479.58962: entering _queue_task() for managed_node1/set_fact
15627 1726882479.59166: worker is 1 (out of 1 available)
15627 1726882479.59177: exiting _queue_task() for managed_node1/set_fact 15627
1726882479.59189: done queuing things up, now waiting for results queue to drain 15627 1726882479.59191: waiting for pending results... 15627 1726882479.59351: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15627 1726882479.59434: in run() - task 0e448fcc-3ce9-2847-7723-000000000275 15627 1726882479.59445: variable 'ansible_search_path' from source: unknown 15627 1726882479.59449: variable 'ansible_search_path' from source: unknown 15627 1726882479.59477: calling self._execute() 15627 1726882479.59543: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.59548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.59559: variable 'omit' from source: magic vars 15627 1726882479.59802: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.59817: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.59897: variable 'profile_stat' from source: set_fact 15627 1726882479.59907: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882479.59910: when evaluation is False, skipping this task 15627 1726882479.59913: _execute() done 15627 1726882479.59915: dumping result to json 15627 1726882479.59919: done dumping result, returning 15627 1726882479.59922: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000275] 15627 1726882479.59928: sending task result for task 0e448fcc-3ce9-2847-7723-000000000275 15627 1726882479.60011: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000275 15627 1726882479.60014: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882479.60057: no more pending results, returning what we have 15627 1726882479.60061: results queue empty 
15627 1726882479.60062: checking for any_errors_fatal 15627 1726882479.60070: done checking for any_errors_fatal 15627 1726882479.60071: checking for max_fail_percentage 15627 1726882479.60073: done checking for max_fail_percentage 15627 1726882479.60073: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.60075: done checking to see if all hosts have failed 15627 1726882479.60075: getting the remaining hosts for this loop 15627 1726882479.60077: done getting the remaining hosts for this loop 15627 1726882479.60080: getting the next task for host managed_node1 15627 1726882479.60088: done getting next task for host managed_node1 15627 1726882479.60090: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15627 1726882479.60093: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15627 1726882479.60100: getting variables
15627 1726882479.60101: in VariableManager get_vars()
15627 1726882479.60127: Calling all_inventory to load vars for managed_node1
15627 1726882479.60130: Calling groups_inventory to load vars for managed_node1
15627 1726882479.60133: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.60143: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.60146: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.60149: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.62270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.66017: done with get_vars()
15627 1726882479.66044: done getting variables
15627 1726882479.66106: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.66219: variable 'profile' from source: play vars
15627 1726882479.66223: variable 'interface' from source: set_fact
15627 1726882479.66280: variable 'interface' from source: set_fact

TASK [Assert that the profile is present - 'LSR-TST-br31'] *********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 21:34:39 -0400 (0:00:00.073) 0:00:19.414 ******
15627 1726882479.66310: entering _queue_task() for managed_node1/assert
15627 1726882479.66604: worker is 1 (out of 1 available)
15627 1726882479.66617: exiting _queue_task() for managed_node1/assert
15627 1726882479.66630: done queuing things up, now waiting for results queue to drain
15627 1726882479.66631: waiting for pending results...
15627 1726882479.66898: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' 15627 1726882479.67025: in run() - task 0e448fcc-3ce9-2847-7723-000000000260 15627 1726882479.67046: variable 'ansible_search_path' from source: unknown 15627 1726882479.67055: variable 'ansible_search_path' from source: unknown 15627 1726882479.67100: calling self._execute() 15627 1726882479.67194: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.67204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.67217: variable 'omit' from source: magic vars 15627 1726882479.67569: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.67587: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.67600: variable 'omit' from source: magic vars 15627 1726882479.67650: variable 'omit' from source: magic vars 15627 1726882479.67754: variable 'profile' from source: play vars 15627 1726882479.67767: variable 'interface' from source: set_fact 15627 1726882479.67834: variable 'interface' from source: set_fact 15627 1726882479.67856: variable 'omit' from source: magic vars 15627 1726882479.67950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882479.68073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882479.68126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882479.68211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.68253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.68356: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 15627 1726882479.68389: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.68397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.68608: Set connection var ansible_timeout to 10 15627 1726882479.68622: Set connection var ansible_shell_executable to /bin/sh 15627 1726882479.68630: Set connection var ansible_connection to ssh 15627 1726882479.68644: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882479.68654: Set connection var ansible_pipelining to False 15627 1726882479.68662: Set connection var ansible_shell_type to sh 15627 1726882479.68693: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.68702: variable 'ansible_connection' from source: unknown 15627 1726882479.68709: variable 'ansible_module_compression' from source: unknown 15627 1726882479.68716: variable 'ansible_shell_type' from source: unknown 15627 1726882479.68723: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.68730: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.68738: variable 'ansible_pipelining' from source: unknown 15627 1726882479.68756: variable 'ansible_timeout' from source: unknown 15627 1726882479.68816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.69185: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882479.69250: variable 'omit' from source: magic vars 15627 1726882479.69262: starting attempt loop 15627 1726882479.69272: running the handler 15627 1726882479.69485: variable 'lsr_net_profile_exists' from source: set_fact 15627 1726882479.69496: Evaluated conditional 
(lsr_net_profile_exists): True 15627 1726882479.69511: handler run complete 15627 1726882479.69531: attempt loop complete, returning result 15627 1726882479.69537: _execute() done 15627 1726882479.69543: dumping result to json 15627 1726882479.69550: done dumping result, returning 15627 1726882479.69561: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-000000000260] 15627 1726882479.69587: sending task result for task 0e448fcc-3ce9-2847-7723-000000000260 15627 1726882479.69711: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000260 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882479.69803: no more pending results, returning what we have 15627 1726882479.69806: results queue empty 15627 1726882479.69808: checking for any_errors_fatal 15627 1726882479.69816: done checking for any_errors_fatal 15627 1726882479.69817: checking for max_fail_percentage 15627 1726882479.69819: done checking for max_fail_percentage 15627 1726882479.69820: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.69822: done checking to see if all hosts have failed 15627 1726882479.69822: getting the remaining hosts for this loop 15627 1726882479.69828: done getting the remaining hosts for this loop 15627 1726882479.69836: getting the next task for host managed_node1 15627 1726882479.69844: done getting next task for host managed_node1 15627 1726882479.69846: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15627 1726882479.69849: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882479.69854: getting variables
15627 1726882479.69856: in VariableManager get_vars()
15627 1726882479.69888: Calling all_inventory to load vars for managed_node1
15627 1726882479.69891: Calling groups_inventory to load vars for managed_node1
15627 1726882479.69895: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882479.69907: Calling all_plugins_play to load vars for managed_node1
15627 1726882479.69910: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882479.69914: Calling groups_plugins_play to load vars for managed_node1
15627 1726882479.71192: WORKER PROCESS EXITING
15627 1726882479.72121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882479.74322: done with get_vars()
15627 1726882479.74346: done getting variables
15627 1726882479.74457: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882479.74647: variable 'profile' from source: play vars
15627 1726882479.74650: variable 'interface' from source: set_fact
15627 1726882479.74743: variable 'interface' from source: set_fact

TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024 21:34:39 -0400 (0:00:00.085) 0:00:19.500 ******
15627 1726882479.74845: entering _queue_task() for managed_node1/assert
15627 1726882479.75222: worker is 1
(out of 1 available) 15627 1726882479.75235: exiting _queue_task() for managed_node1/assert 15627 1726882479.75250: done queuing things up, now waiting for results queue to drain 15627 1726882479.75277: waiting for pending results... 15627 1726882479.75699: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15627 1726882479.75836: in run() - task 0e448fcc-3ce9-2847-7723-000000000261 15627 1726882479.75862: variable 'ansible_search_path' from source: unknown 15627 1726882479.75874: variable 'ansible_search_path' from source: unknown 15627 1726882479.75943: calling self._execute() 15627 1726882479.76236: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.76248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.76271: variable 'omit' from source: magic vars 15627 1726882479.77202: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.77241: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.77256: variable 'omit' from source: magic vars 15627 1726882479.77341: variable 'omit' from source: magic vars 15627 1726882479.77448: variable 'profile' from source: play vars 15627 1726882479.77460: variable 'interface' from source: set_fact 15627 1726882479.77526: variable 'interface' from source: set_fact 15627 1726882479.77553: variable 'omit' from source: magic vars 15627 1726882479.77601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882479.77640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882479.77675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882479.77698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15627 1726882479.77729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.77766: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882479.77776: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.77784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.77890: Set connection var ansible_timeout to 10 15627 1726882479.77905: Set connection var ansible_shell_executable to /bin/sh 15627 1726882479.77916: Set connection var ansible_connection to ssh 15627 1726882479.77926: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882479.77936: Set connection var ansible_pipelining to False 15627 1726882479.77944: Set connection var ansible_shell_type to sh 15627 1726882479.77978: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.77992: variable 'ansible_connection' from source: unknown 15627 1726882479.78000: variable 'ansible_module_compression' from source: unknown 15627 1726882479.78008: variable 'ansible_shell_type' from source: unknown 15627 1726882479.78015: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.78023: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.78031: variable 'ansible_pipelining' from source: unknown 15627 1726882479.78038: variable 'ansible_timeout' from source: unknown 15627 1726882479.78046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.78367: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882479.78412: 
variable 'omit' from source: magic vars 15627 1726882479.78445: starting attempt loop 15627 1726882479.78468: running the handler 15627 1726882479.78647: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15627 1726882479.78691: Evaluated conditional (lsr_net_profile_ansible_managed): True 15627 1726882479.78715: handler run complete 15627 1726882479.78747: attempt loop complete, returning result 15627 1726882479.78757: _execute() done 15627 1726882479.78766: dumping result to json 15627 1726882479.78775: done dumping result, returning 15627 1726882479.78785: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-000000000261] 15627 1726882479.78794: sending task result for task 0e448fcc-3ce9-2847-7723-000000000261 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882479.78941: no more pending results, returning what we have 15627 1726882479.78944: results queue empty 15627 1726882479.78946: checking for any_errors_fatal 15627 1726882479.78951: done checking for any_errors_fatal 15627 1726882479.78952: checking for max_fail_percentage 15627 1726882479.78955: done checking for max_fail_percentage 15627 1726882479.78956: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.78958: done checking to see if all hosts have failed 15627 1726882479.78959: getting the remaining hosts for this loop 15627 1726882479.78960: done getting the remaining hosts for this loop 15627 1726882479.78966: getting the next task for host managed_node1 15627 1726882479.78974: done getting next task for host managed_node1 15627 1726882479.78977: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15627 1726882479.78979: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882479.78985: getting variables 15627 1726882479.78987: in VariableManager get_vars() 15627 1726882479.79022: Calling all_inventory to load vars for managed_node1 15627 1726882479.79025: Calling groups_inventory to load vars for managed_node1 15627 1726882479.79029: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882479.79040: Calling all_plugins_play to load vars for managed_node1 15627 1726882479.79043: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882479.79046: Calling groups_plugins_play to load vars for managed_node1 15627 1726882479.80067: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000261 15627 1726882479.80071: WORKER PROCESS EXITING 15627 1726882479.81260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882479.83036: done with get_vars() 15627 1726882479.83051: done getting variables 15627 1726882479.83097: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15627 1726882479.83186: variable 'profile' from source: play vars 15627 1726882479.83189: variable 'interface' from source: set_fact 15627 1726882479.83238: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:34:39 -0400 (0:00:00.084) 0:00:19.584 ****** 15627 1726882479.83286: entering _queue_task() for managed_node1/assert 15627 1726882479.83537: worker is 1 (out of 1 available) 15627 1726882479.83549: exiting _queue_task() for managed_node1/assert 15627 1726882479.83561: done queuing things up, now waiting for results queue to drain 15627 1726882479.83562: waiting for pending results... 15627 1726882479.83770: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15627 1726882479.83831: in run() - task 0e448fcc-3ce9-2847-7723-000000000262 15627 1726882479.83844: variable 'ansible_search_path' from source: unknown 15627 1726882479.83852: variable 'ansible_search_path' from source: unknown 15627 1726882479.83916: calling self._execute() 15627 1726882479.84049: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.84053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.84077: variable 'omit' from source: magic vars 15627 1726882479.84483: variable 'ansible_distribution_major_version' from source: facts 15627 1726882479.84493: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882479.84499: variable 'omit' from source: magic vars 15627 1726882479.84531: variable 'omit' from source: magic vars 15627 1726882479.84621: variable 'profile' from source: play vars 15627 1726882479.84633: variable 'interface' from source: set_fact 15627 1726882479.84698: variable 'interface' from source: set_fact 15627 1726882479.84714: variable 'omit' from source: magic vars 15627 1726882479.84746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882479.84779: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882479.84808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882479.84826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.84861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882479.84916: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882479.84920: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.84923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882479.85124: Set connection var ansible_timeout to 10 15627 1726882479.85157: Set connection var ansible_shell_executable to /bin/sh 15627 1726882479.85169: Set connection var ansible_connection to ssh 15627 1726882479.85178: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882479.85191: Set connection var ansible_pipelining to False 15627 1726882479.85197: Set connection var ansible_shell_type to sh 15627 1726882479.85235: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.85243: variable 'ansible_connection' from source: unknown 15627 1726882479.85249: variable 'ansible_module_compression' from source: unknown 15627 1726882479.85257: variable 'ansible_shell_type' from source: unknown 15627 1726882479.85269: variable 'ansible_shell_executable' from source: unknown 15627 1726882479.85293: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882479.85324: variable 'ansible_pipelining' from source: unknown 15627 1726882479.85341: variable 'ansible_timeout' from source: unknown 15627 1726882479.85349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 
1726882479.85563: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882479.85580: variable 'omit' from source: magic vars 15627 1726882479.85604: starting attempt loop 15627 1726882479.85611: running the handler 15627 1726882479.85800: variable 'lsr_net_profile_fingerprint' from source: set_fact 15627 1726882479.85811: Evaluated conditional (lsr_net_profile_fingerprint): True 15627 1726882479.85821: handler run complete 15627 1726882479.85838: attempt loop complete, returning result 15627 1726882479.85848: _execute() done 15627 1726882479.85857: dumping result to json 15627 1726882479.85871: done dumping result, returning 15627 1726882479.85887: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000262] 15627 1726882479.85891: sending task result for task 0e448fcc-3ce9-2847-7723-000000000262 15627 1726882479.85988: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000262 15627 1726882479.85991: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882479.86043: no more pending results, returning what we have 15627 1726882479.86046: results queue empty 15627 1726882479.86047: checking for any_errors_fatal 15627 1726882479.86053: done checking for any_errors_fatal 15627 1726882479.86054: checking for max_fail_percentage 15627 1726882479.86056: done checking for max_fail_percentage 15627 1726882479.86057: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.86059: done checking to see if all hosts have failed 15627 1726882479.86059: getting the remaining hosts for this loop 15627 1726882479.86061: done getting the 
remaining hosts for this loop 15627 1726882479.86066: getting the next task for host managed_node1 15627 1726882479.86075: done getting next task for host managed_node1 15627 1726882479.86077: ^ task is: TASK: meta (flush_handlers) 15627 1726882479.86088: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882479.86116: getting variables 15627 1726882479.86118: in VariableManager get_vars() 15627 1726882479.86154: Calling all_inventory to load vars for managed_node1 15627 1726882479.86157: Calling groups_inventory to load vars for managed_node1 15627 1726882479.86161: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882479.86198: Calling all_plugins_play to load vars for managed_node1 15627 1726882479.86202: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882479.86206: Calling groups_plugins_play to load vars for managed_node1 15627 1726882479.87822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882479.88766: done with get_vars() 15627 1726882479.88780: done getting variables 15627 1726882479.88827: in VariableManager get_vars() 15627 1726882479.88834: Calling all_inventory to load vars for managed_node1 15627 1726882479.88836: Calling groups_inventory to load vars for managed_node1 15627 1726882479.88837: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882479.88840: Calling all_plugins_play to load vars for managed_node1 15627 1726882479.88842: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882479.88843: Calling groups_plugins_play to load vars for managed_node1 15627 1726882479.90884: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882479.93167: done with get_vars() 15627 1726882479.93220: done queuing things up, now waiting for results queue to drain 15627 1726882479.93222: results queue empty 15627 1726882479.93222: checking for any_errors_fatal 15627 1726882479.93224: done checking for any_errors_fatal 15627 1726882479.93225: checking for max_fail_percentage 15627 1726882479.93226: done checking for max_fail_percentage 15627 1726882479.93230: checking to see if all hosts have failed and the running result is not ok 15627 1726882479.93230: done checking to see if all hosts have failed 15627 1726882479.93231: getting the remaining hosts for this loop 15627 1726882479.93232: done getting the remaining hosts for this loop 15627 1726882479.93233: getting the next task for host managed_node1 15627 1726882479.93236: done getting next task for host managed_node1 15627 1726882479.93237: ^ task is: TASK: meta (flush_handlers) 15627 1726882479.93238: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882479.93240: getting variables 15627 1726882479.93240: in VariableManager get_vars() 15627 1726882479.93248: Calling all_inventory to load vars for managed_node1 15627 1726882479.93250: Calling groups_inventory to load vars for managed_node1 15627 1726882479.93261: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882479.93269: Calling all_plugins_play to load vars for managed_node1 15627 1726882479.93274: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882479.93278: Calling groups_plugins_play to load vars for managed_node1 15627 1726882479.97857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882480.01115: done with get_vars() 15627 1726882480.01139: done getting variables 15627 1726882480.01274: in VariableManager get_vars() 15627 1726882480.01304: Calling all_inventory to load vars for managed_node1 15627 1726882480.01306: Calling groups_inventory to load vars for managed_node1 15627 1726882480.01309: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882480.01331: Calling all_plugins_play to load vars for managed_node1 15627 1726882480.01380: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882480.01384: Calling groups_plugins_play to load vars for managed_node1 15627 1726882480.09198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882480.11741: done with get_vars() 15627 1726882480.11771: done queuing things up, now waiting for results queue to drain 15627 1726882480.11773: results queue empty 15627 1726882480.11774: checking for any_errors_fatal 15627 1726882480.11775: done checking for any_errors_fatal 15627 1726882480.11776: checking for max_fail_percentage 15627 1726882480.11777: done checking for max_fail_percentage 15627 1726882480.11778: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882480.11779: done checking to see if all hosts have failed 15627 1726882480.11780: getting the remaining hosts for this loop 15627 1726882480.11781: done getting the remaining hosts for this loop 15627 1726882480.11784: getting the next task for host managed_node1 15627 1726882480.11787: done getting next task for host managed_node1 15627 1726882480.11788: ^ task is: None 15627 1726882480.11789: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882480.11790: done queuing things up, now waiting for results queue to drain 15627 1726882480.11792: results queue empty 15627 1726882480.11792: checking for any_errors_fatal 15627 1726882480.11793: done checking for any_errors_fatal 15627 1726882480.11794: checking for max_fail_percentage 15627 1726882480.11795: done checking for max_fail_percentage 15627 1726882480.11795: checking to see if all hosts have failed and the running result is not ok 15627 1726882480.11796: done checking to see if all hosts have failed 15627 1726882480.11797: getting the next task for host managed_node1 15627 1726882480.11800: done getting next task for host managed_node1 15627 1726882480.11801: ^ task is: None 15627 1726882480.11802: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882480.11836: in VariableManager get_vars() 15627 1726882480.11857: done with get_vars() 15627 1726882480.11866: in VariableManager get_vars() 15627 1726882480.11881: done with get_vars() 15627 1726882480.11887: variable 'omit' from source: magic vars 15627 1726882480.11980: variable 'profile' from source: play vars 15627 1726882480.12080: in VariableManager get_vars() 15627 1726882480.12094: done with get_vars() 15627 1726882480.12115: variable 'omit' from source: magic vars 15627 1726882480.12178: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15627 1726882480.12834: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882480.12855: getting the remaining hosts for this loop 15627 1726882480.12857: done getting the remaining hosts for this loop 15627 1726882480.12859: getting the next task for host managed_node1 15627 1726882480.12862: done getting next task for host managed_node1 15627 1726882480.12865: ^ task is: TASK: Gathering Facts 15627 1726882480.12867: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882480.12869: getting variables 15627 1726882480.12870: in VariableManager get_vars() 15627 1726882480.12880: Calling all_inventory to load vars for managed_node1 15627 1726882480.12882: Calling groups_inventory to load vars for managed_node1 15627 1726882480.12884: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882480.12889: Calling all_plugins_play to load vars for managed_node1 15627 1726882480.12892: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882480.12894: Calling groups_plugins_play to load vars for managed_node1 15627 1726882480.14774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882480.16433: done with get_vars() 15627 1726882480.16453: done getting variables 15627 1726882480.16496: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:34:40 -0400 (0:00:00.332) 0:00:19.916 ****** 15627 1726882480.16517: entering _queue_task() for managed_node1/gather_facts 15627 1726882480.17207: worker is 1 (out of 1 available) 15627 1726882480.17218: exiting _queue_task() for managed_node1/gather_facts 15627 1726882480.17229: done queuing things up, now waiting for results queue to drain 15627 1726882480.17230: waiting for pending results... 
15627 1726882480.18127: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882480.18305: in run() - task 0e448fcc-3ce9-2847-7723-0000000002b5 15627 1726882480.18440: variable 'ansible_search_path' from source: unknown 15627 1726882480.18482: calling self._execute() 15627 1726882480.18707: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882480.18720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882480.18737: variable 'omit' from source: magic vars 15627 1726882480.20384: variable 'ansible_distribution_major_version' from source: facts 15627 1726882480.20402: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882480.20413: variable 'omit' from source: magic vars 15627 1726882480.20440: variable 'omit' from source: magic vars 15627 1726882480.20485: variable 'omit' from source: magic vars 15627 1726882480.20529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882480.20571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882480.20595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882480.20688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882480.20705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882480.20737: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882480.20746: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882480.20756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882480.20967: Set connection var ansible_timeout to 10 15627 1726882480.20982: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882480.21078: Set connection var ansible_connection to ssh 15627 1726882480.21088: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882480.21096: Set connection var ansible_pipelining to False 15627 1726882480.21102: Set connection var ansible_shell_type to sh 15627 1726882480.21129: variable 'ansible_shell_executable' from source: unknown 15627 1726882480.21137: variable 'ansible_connection' from source: unknown 15627 1726882480.21144: variable 'ansible_module_compression' from source: unknown 15627 1726882480.21152: variable 'ansible_shell_type' from source: unknown 15627 1726882480.21161: variable 'ansible_shell_executable' from source: unknown 15627 1726882480.21170: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882480.21178: variable 'ansible_pipelining' from source: unknown 15627 1726882480.21185: variable 'ansible_timeout' from source: unknown 15627 1726882480.21192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882480.21630: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882480.21645: variable 'omit' from source: magic vars 15627 1726882480.21657: starting attempt loop 15627 1726882480.21666: running the handler 15627 1726882480.21686: variable 'ansible_facts' from source: unknown 15627 1726882480.21707: _low_level_execute_command(): starting 15627 1726882480.21718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882480.23224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882480.23229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.23273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882480.23276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.23278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.23340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882480.23352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882480.23479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882480.25162: stdout chunk (state=3): >>>/root <<< 15627 1726882480.25262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882480.25339: stderr chunk (state=3): >>><<< 15627 1726882480.25343: stdout chunk (state=3): >>><<< 15627 1726882480.25462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882480.25469: _low_level_execute_command(): starting 15627 1726882480.25474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048 `" && echo ansible-tmp-1726882480.2536628-16494-223720435557048="` echo /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048 `" ) && sleep 0' 15627 1726882480.27953: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.27960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.27994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.28006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882480.28009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.28169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882480.28241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882480.28245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882480.28356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882480.30247: stdout chunk (state=3): >>>ansible-tmp-1726882480.2536628-16494-223720435557048=/root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048 <<< 15627 1726882480.30362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882480.30435: stderr chunk (state=3): >>><<< 15627 1726882480.30438: stdout chunk (state=3): >>><<< 15627 1726882480.30572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882480.2536628-16494-223720435557048=/root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882480.30576: variable 'ansible_module_compression' from source: unknown 15627 1726882480.30579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882480.30681: variable 'ansible_facts' from source: unknown 15627 1726882480.30794: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/AnsiballZ_setup.py 15627 1726882480.31073: Sending initial data 15627 1726882480.31084: Sent initial data (154 bytes) 15627 1726882480.32118: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882480.32129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.32142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882480.32162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.32210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882480.32225: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882480.32239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.32256: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 15627 1726882480.32272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882480.32283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882480.32294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.32307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882480.32323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.32337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882480.32347: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882480.32359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.32442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882480.32469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882480.32490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882480.32613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882480.34332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882480.34420: stderr chunk (state=3): >>>debug1: 
Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882480.34514: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmptxst5d8l /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/AnsiballZ_setup.py <<< 15627 1726882480.34601: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882480.38328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882480.38453: stderr chunk (state=3): >>><<< 15627 1726882480.38457: stdout chunk (state=3): >>><<< 15627 1726882480.38459: done transferring module to remote 15627 1726882480.38462: _low_level_execute_command(): starting 15627 1726882480.38470: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/ /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/AnsiballZ_setup.py && sleep 0' 15627 1726882480.39844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882480.39853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.39867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882480.39883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.39922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882480.39978: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882480.39989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.40003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882480.40010: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882480.40019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882480.40026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.40036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882480.40047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.40054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882480.40066: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882480.40180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.40251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882480.40266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882480.40281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882480.40484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882480.42277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882480.42281: stdout chunk (state=3): >>><<< 15627 1726882480.42287: stderr chunk (state=3): >>><<< 15627 1726882480.42311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882480.42314: _low_level_execute_command(): starting 15627 1726882480.42319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/AnsiballZ_setup.py && sleep 0' 15627 1726882480.43919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882480.43922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882480.43966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882480.43970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882480.43972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found <<< 15627 1726882480.43975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882480.44138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882480.44230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882480.44237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882480.44351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882480.96000: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixe<<< 15627 1726882480.96011: stdout chunk (state=3): >>>d]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "40", "epoch": "1726882480", "epoch_int": "1726882480", "date": "2024-09-20", "time": "21:34:40", "iso8601_micro": "2024-09-21T01:34:40.737920Z", "iso8601": "2024-09-21T01:34:40Z", "iso8601_basic": "20240920T213440737920", "iso8601_basic_short": "20240920T213440", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2816, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 716, "free": 2816}, "nocache": {"free": 3277, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 638, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241410048, "block_size": 4096, "block_total": 65519355, "block_available": 64512063, "block_used": 1007292, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882480.97809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882480.97957: stderr chunk (state=3): >>><<< 15627 1726882480.97972: stdout chunk (state=3): >>><<< 15627 1726882480.98280: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_local": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["LSR-TST-br31", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "e6:c3:a0:67:8a:2a", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "40", "epoch": "1726882480", "epoch_int": "1726882480", "date": "2024-09-20", "time": "21:34:40", "iso8601_micro": "2024-09-21T01:34:40.737920Z", "iso8601": "2024-09-21T01:34:40Z", "iso8601_basic": "20240920T213440737920", "iso8601_basic_short": "20240920T213440", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2816, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 716, "free": 2816}, "nocache": {"free": 3277, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 638, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241410048, "block_size": 4096, "block_total": 65519355, "block_available": 64512063, "block_used": 1007292, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882480.99091: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882480.99123: _low_level_execute_command(): starting 15627 1726882480.99143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882480.2536628-16494-223720435557048/ > /dev/null 2>&1 && sleep 0' 15627 1726882481.00200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882481.00224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.00250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.00277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.00331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.00334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15627 1726882481.00336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.00396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882481.00399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882481.00510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882481.02344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882481.02446: stderr chunk (state=3): >>><<< 15627 1726882481.02449: stdout chunk (state=3): >>><<< 15627 1726882481.02496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882481.02531: handler run complete 15627 1726882481.02741: variable 'ansible_facts' from source: unknown 15627 1726882481.02883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.03407: variable 'ansible_facts' from source: unknown 15627 1726882481.03533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.03710: attempt loop complete, returning result 15627 1726882481.03721: _execute() done 15627 1726882481.03728: dumping result to json 15627 1726882481.03825: done dumping result, returning 15627 1726882481.03841: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-0000000002b5] 15627 1726882481.03850: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002b5 ok: [managed_node1] 15627 1726882481.05310: no more pending results, returning what we have 15627 1726882481.05313: results queue empty 15627 1726882481.05314: checking for any_errors_fatal 15627 1726882481.05316: done checking for any_errors_fatal 15627 1726882481.05317: checking for max_fail_percentage 15627 1726882481.05318: done checking for max_fail_percentage 15627 1726882481.05319: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.05321: done checking to see if all hosts have failed 15627 1726882481.05321: getting the remaining hosts for this loop 15627 1726882481.05323: done getting the remaining hosts for this loop 15627 1726882481.05327: getting the next task for host managed_node1 15627 1726882481.05334: done getting next task for host managed_node1 15627 1726882481.05336: ^ task is: TASK: meta (flush_handlers) 15627 1726882481.05338: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882481.05342: getting variables 15627 1726882481.05344: in VariableManager get_vars() 15627 1726882481.05571: Calling all_inventory to load vars for managed_node1 15627 1726882481.05574: Calling groups_inventory to load vars for managed_node1 15627 1726882481.05577: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.05776: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.05780: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.05784: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.06753: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002b5 15627 1726882481.06756: WORKER PROCESS EXITING 15627 1726882481.07748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.09584: done with get_vars() 15627 1726882481.09610: done getting variables 15627 1726882481.09678: in VariableManager get_vars() 15627 1726882481.09688: Calling all_inventory to load vars for managed_node1 15627 1726882481.09690: Calling groups_inventory to load vars for managed_node1 15627 1726882481.09691: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.09694: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.09696: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.09697: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.10588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.11893: done with get_vars() 15627 1726882481.11920: done queuing things up, now waiting for results queue to drain 15627 1726882481.11924: results queue empty 15627 
1726882481.11925: checking for any_errors_fatal 15627 1726882481.11928: done checking for any_errors_fatal 15627 1726882481.11930: checking for max_fail_percentage 15627 1726882481.11934: done checking for max_fail_percentage 15627 1726882481.11935: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.11938: done checking to see if all hosts have failed 15627 1726882481.11939: getting the remaining hosts for this loop 15627 1726882481.11939: done getting the remaining hosts for this loop 15627 1726882481.11942: getting the next task for host managed_node1 15627 1726882481.11954: done getting next task for host managed_node1 15627 1726882481.11959: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882481.11963: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882481.11974: getting variables 15627 1726882481.11975: in VariableManager get_vars() 15627 1726882481.12002: Calling all_inventory to load vars for managed_node1 15627 1726882481.12004: Calling groups_inventory to load vars for managed_node1 15627 1726882481.12006: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.12009: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.12010: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.12012: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.12943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.13890: done with get_vars() 15627 1726882481.13903: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:41 -0400 (0:00:00.974) 0:00:20.891 ****** 15627 1726882481.13957: entering _queue_task() for managed_node1/include_tasks 15627 1726882481.14175: worker is 1 (out of 1 available) 15627 1726882481.14188: exiting _queue_task() for managed_node1/include_tasks 15627 1726882481.14199: done queuing things up, now waiting for results queue to drain 15627 1726882481.14201: waiting for pending results... 
15627 1726882481.14373: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882481.14443: in run() - task 0e448fcc-3ce9-2847-7723-00000000003a 15627 1726882481.14482: variable 'ansible_search_path' from source: unknown 15627 1726882481.14497: variable 'ansible_search_path' from source: unknown 15627 1726882481.14532: calling self._execute() 15627 1726882481.14617: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.14623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.14632: variable 'omit' from source: magic vars 15627 1726882481.14979: variable 'ansible_distribution_major_version' from source: facts 15627 1726882481.14992: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882481.15004: _execute() done 15627 1726882481.15012: dumping result to json 15627 1726882481.15019: done dumping result, returning 15627 1726882481.15026: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-2847-7723-00000000003a] 15627 1726882481.15031: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003a 15627 1726882481.15203: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003a 15627 1726882481.15206: WORKER PROCESS EXITING 15627 1726882481.15253: no more pending results, returning what we have 15627 1726882481.15260: in VariableManager get_vars() 15627 1726882481.15294: Calling all_inventory to load vars for managed_node1 15627 1726882481.15297: Calling groups_inventory to load vars for managed_node1 15627 1726882481.15299: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.15307: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.15309: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.15311: Calling 
groups_plugins_play to load vars for managed_node1 15627 1726882481.16548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.17589: done with get_vars() 15627 1726882481.17602: variable 'ansible_search_path' from source: unknown 15627 1726882481.17603: variable 'ansible_search_path' from source: unknown 15627 1726882481.17622: we have included files to process 15627 1726882481.17623: generating all_blocks data 15627 1726882481.17624: done generating all_blocks data 15627 1726882481.17625: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882481.17626: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882481.17628: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882481.17987: done processing included file 15627 1726882481.17988: iterating over new_blocks loaded from include file 15627 1726882481.17990: in VariableManager get_vars() 15627 1726882481.18002: done with get_vars() 15627 1726882481.18003: filtering new block on tags 15627 1726882481.18012: done filtering new block on tags 15627 1726882481.18014: in VariableManager get_vars() 15627 1726882481.18025: done with get_vars() 15627 1726882481.18026: filtering new block on tags 15627 1726882481.18036: done filtering new block on tags 15627 1726882481.18037: in VariableManager get_vars() 15627 1726882481.18048: done with get_vars() 15627 1726882481.18049: filtering new block on tags 15627 1726882481.18060: done filtering new block on tags 15627 1726882481.18062: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15627 1726882481.18067: extending task lists for 
all hosts with included blocks 15627 1726882481.18268: done extending task lists 15627 1726882481.18269: done processing included files 15627 1726882481.18270: results queue empty 15627 1726882481.18270: checking for any_errors_fatal 15627 1726882481.18271: done checking for any_errors_fatal 15627 1726882481.18271: checking for max_fail_percentage 15627 1726882481.18272: done checking for max_fail_percentage 15627 1726882481.18273: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.18273: done checking to see if all hosts have failed 15627 1726882481.18274: getting the remaining hosts for this loop 15627 1726882481.18275: done getting the remaining hosts for this loop 15627 1726882481.18277: getting the next task for host managed_node1 15627 1726882481.18280: done getting next task for host managed_node1 15627 1726882481.18282: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882481.18284: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882481.18289: getting variables 15627 1726882481.18290: in VariableManager get_vars() 15627 1726882481.18299: Calling all_inventory to load vars for managed_node1 15627 1726882481.18300: Calling groups_inventory to load vars for managed_node1 15627 1726882481.18301: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.18304: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.18306: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.18307: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.19045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.20827: done with get_vars() 15627 1726882481.20843: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:41 -0400 (0:00:00.069) 0:00:20.960 ****** 15627 1726882481.20934: entering _queue_task() for managed_node1/setup 15627 1726882481.21244: worker is 1 (out of 1 available) 15627 1726882481.21260: exiting _queue_task() for managed_node1/setup 15627 1726882481.21273: done queuing things up, now waiting for results queue to drain 15627 1726882481.21275: waiting for pending results... 
15627 1726882481.21540: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882481.21671: in run() - task 0e448fcc-3ce9-2847-7723-0000000002f6 15627 1726882481.21689: variable 'ansible_search_path' from source: unknown 15627 1726882481.21696: variable 'ansible_search_path' from source: unknown 15627 1726882481.21736: calling self._execute() 15627 1726882481.21836: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.21848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.21866: variable 'omit' from source: magic vars 15627 1726882481.22235: variable 'ansible_distribution_major_version' from source: facts 15627 1726882481.22256: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882481.22469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882481.24202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882481.24247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882481.24276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882481.24301: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882481.24322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882481.24382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882481.24402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882481.24419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882481.24448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882481.24463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882481.24500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882481.24516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882481.24532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882481.24568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882481.24578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882481.24684: variable '__network_required_facts' from source: role 
'' defaults 15627 1726882481.24691: variable 'ansible_facts' from source: unknown 15627 1726882481.25268: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15627 1726882481.25278: when evaluation is False, skipping this task 15627 1726882481.25286: _execute() done 15627 1726882481.25293: dumping result to json 15627 1726882481.25299: done dumping result, returning 15627 1726882481.25310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-2847-7723-0000000002f6] 15627 1726882481.25319: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f6 15627 1726882481.25415: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f6 15627 1726882481.25422: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882481.25504: no more pending results, returning what we have 15627 1726882481.25506: results queue empty 15627 1726882481.25507: checking for any_errors_fatal 15627 1726882481.25509: done checking for any_errors_fatal 15627 1726882481.25510: checking for max_fail_percentage 15627 1726882481.25511: done checking for max_fail_percentage 15627 1726882481.25512: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.25513: done checking to see if all hosts have failed 15627 1726882481.25513: getting the remaining hosts for this loop 15627 1726882481.25515: done getting the remaining hosts for this loop 15627 1726882481.25518: getting the next task for host managed_node1 15627 1726882481.25527: done getting next task for host managed_node1 15627 1726882481.25530: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882481.25532: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882481.25545: getting variables 15627 1726882481.25546: in VariableManager get_vars() 15627 1726882481.25582: Calling all_inventory to load vars for managed_node1 15627 1726882481.25585: Calling groups_inventory to load vars for managed_node1 15627 1726882481.25587: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.25596: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.25599: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.25601: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.27321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.29112: done with get_vars() 15627 1726882481.29136: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:41 -0400 (0:00:00.082) 0:00:21.043 ****** 15627 1726882481.29220: entering _queue_task() for managed_node1/stat 15627 1726882481.29519: worker is 1 (out of 1 available) 15627 1726882481.29533: exiting _queue_task() for managed_node1/stat 15627 1726882481.29546: done queuing things up, now waiting for results queue to drain 15627 1726882481.29547: waiting for pending results... 
15627 1726882481.29912: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882481.30054: in run() - task 0e448fcc-3ce9-2847-7723-0000000002f8 15627 1726882481.30077: variable 'ansible_search_path' from source: unknown 15627 1726882481.30085: variable 'ansible_search_path' from source: unknown 15627 1726882481.30131: calling self._execute() 15627 1726882481.30244: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.30256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.30274: variable 'omit' from source: magic vars 15627 1726882481.30766: variable 'ansible_distribution_major_version' from source: facts 15627 1726882481.30785: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882481.30948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882481.31219: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882481.31264: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882481.31303: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882481.31343: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882481.31447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882481.31479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882481.31508: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882481.31546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882481.31634: variable '__network_is_ostree' from source: set_fact 15627 1726882481.31649: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882481.31656: when evaluation is False, skipping this task 15627 1726882481.31662: _execute() done 15627 1726882481.31671: dumping result to json 15627 1726882481.31678: done dumping result, returning 15627 1726882481.31687: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-2847-7723-0000000002f8] 15627 1726882481.31696: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f8 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882481.31835: no more pending results, returning what we have 15627 1726882481.31839: results queue empty 15627 1726882481.31840: checking for any_errors_fatal 15627 1726882481.31846: done checking for any_errors_fatal 15627 1726882481.31847: checking for max_fail_percentage 15627 1726882481.31849: done checking for max_fail_percentage 15627 1726882481.31850: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.31851: done checking to see if all hosts have failed 15627 1726882481.31852: getting the remaining hosts for this loop 15627 1726882481.31854: done getting the remaining hosts for this loop 15627 1726882481.31858: getting the next task for host managed_node1 15627 1726882481.31867: done getting next task for host managed_node1 15627 
1726882481.31871: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882481.31874: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882481.31887: getting variables 15627 1726882481.31889: in VariableManager get_vars() 15627 1726882481.31926: Calling all_inventory to load vars for managed_node1 15627 1726882481.31929: Calling groups_inventory to load vars for managed_node1 15627 1726882481.31931: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.31942: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.31945: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.31948: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.33027: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f8 15627 1726882481.33030: WORKER PROCESS EXITING 15627 1726882481.33635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.35617: done with get_vars() 15627 1726882481.35641: done getting variables 15627 1726882481.35708: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:41 -0400 (0:00:00.065) 0:00:21.109 ****** 15627 1726882481.35743: entering _queue_task() for managed_node1/set_fact 15627 1726882481.36044: worker is 1 (out of 1 available) 15627 1726882481.36056: exiting _queue_task() for managed_node1/set_fact 15627 1726882481.36071: done queuing things up, now waiting for results queue to drain 15627 1726882481.36073: waiting for pending results... 15627 1726882481.36351: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882481.36487: in run() - task 0e448fcc-3ce9-2847-7723-0000000002f9 15627 1726882481.36505: variable 'ansible_search_path' from source: unknown 15627 1726882481.36515: variable 'ansible_search_path' from source: unknown 15627 1726882481.36553: calling self._execute() 15627 1726882481.36650: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.36661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.36677: variable 'omit' from source: magic vars 15627 1726882481.37041: variable 'ansible_distribution_major_version' from source: facts 15627 1726882481.37069: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882481.37236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882481.37527: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882481.37577: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882481.37624: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 
1726882481.37660: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882481.37753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882481.37783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882481.37816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882481.37849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882481.37942: variable '__network_is_ostree' from source: set_fact 15627 1726882481.37953: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882481.37961: when evaluation is False, skipping this task 15627 1726882481.37970: _execute() done 15627 1726882481.37977: dumping result to json 15627 1726882481.37983: done dumping result, returning 15627 1726882481.37993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-2847-7723-0000000002f9] 15627 1726882481.38002: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f9 15627 1726882481.38101: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002f9 15627 1726882481.38107: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882481.38182: no more pending results, returning what we 
have 15627 1726882481.38185: results queue empty 15627 1726882481.38186: checking for any_errors_fatal 15627 1726882481.38193: done checking for any_errors_fatal 15627 1726882481.38194: checking for max_fail_percentage 15627 1726882481.38195: done checking for max_fail_percentage 15627 1726882481.38196: checking to see if all hosts have failed and the running result is not ok 15627 1726882481.38197: done checking to see if all hosts have failed 15627 1726882481.38199: getting the remaining hosts for this loop 15627 1726882481.38201: done getting the remaining hosts for this loop 15627 1726882481.38205: getting the next task for host managed_node1 15627 1726882481.38215: done getting next task for host managed_node1 15627 1726882481.38219: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882481.38222: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882481.38236: getting variables 15627 1726882481.38237: in VariableManager get_vars() 15627 1726882481.38275: Calling all_inventory to load vars for managed_node1 15627 1726882481.38278: Calling groups_inventory to load vars for managed_node1 15627 1726882481.38281: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882481.38290: Calling all_plugins_play to load vars for managed_node1 15627 1726882481.38294: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882481.38296: Calling groups_plugins_play to load vars for managed_node1 15627 1726882481.40108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882481.41860: done with get_vars() 15627 1726882481.41883: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:41 -0400 (0:00:00.062) 0:00:21.171 ****** 15627 1726882481.41981: entering _queue_task() for managed_node1/service_facts 15627 1726882481.42260: worker is 1 (out of 1 available) 15627 1726882481.42279: exiting _queue_task() for managed_node1/service_facts 15627 1726882481.42291: done queuing things up, now waiting for results queue to drain 15627 1726882481.42293: waiting for pending results... 
15627 1726882481.42566: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882481.42688: in run() - task 0e448fcc-3ce9-2847-7723-0000000002fb 15627 1726882481.42707: variable 'ansible_search_path' from source: unknown 15627 1726882481.42719: variable 'ansible_search_path' from source: unknown 15627 1726882481.42760: calling self._execute() 15627 1726882481.42859: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.42874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.42888: variable 'omit' from source: magic vars 15627 1726882481.43267: variable 'ansible_distribution_major_version' from source: facts 15627 1726882481.43287: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882481.43298: variable 'omit' from source: magic vars 15627 1726882481.43355: variable 'omit' from source: magic vars 15627 1726882481.43400: variable 'omit' from source: magic vars 15627 1726882481.43438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882481.43478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882481.43508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882481.43528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882481.43543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882481.43576: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882481.43584: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.43597: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882481.43697: Set connection var ansible_timeout to 10 15627 1726882481.43715: Set connection var ansible_shell_executable to /bin/sh 15627 1726882481.43723: Set connection var ansible_connection to ssh 15627 1726882481.43732: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882481.43740: Set connection var ansible_pipelining to False 15627 1726882481.43746: Set connection var ansible_shell_type to sh 15627 1726882481.43773: variable 'ansible_shell_executable' from source: unknown 15627 1726882481.43780: variable 'ansible_connection' from source: unknown 15627 1726882481.43786: variable 'ansible_module_compression' from source: unknown 15627 1726882481.43792: variable 'ansible_shell_type' from source: unknown 15627 1726882481.43797: variable 'ansible_shell_executable' from source: unknown 15627 1726882481.43803: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882481.43816: variable 'ansible_pipelining' from source: unknown 15627 1726882481.43822: variable 'ansible_timeout' from source: unknown 15627 1726882481.43828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882481.44018: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882481.44039: variable 'omit' from source: magic vars 15627 1726882481.44047: starting attempt loop 15627 1726882481.44053: running the handler 15627 1726882481.44071: _low_level_execute_command(): starting 15627 1726882481.44082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882481.44849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882481.44867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15627 1726882481.44883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.44907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.44952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.44967: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882481.44982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.45000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882481.45015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882481.45025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882481.45036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.45047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.45061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.45075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.45085: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882481.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.45180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882481.45197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882481.45212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882481.45353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882481.47005: stdout chunk (state=3): >>>/root <<< 15627 1726882481.47173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882481.47178: stdout chunk (state=3): >>><<< 15627 1726882481.47186: stderr chunk (state=3): >>><<< 15627 1726882481.47204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882481.47218: _low_level_execute_command(): starting 15627 1726882481.47222: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856 `" && echo ansible-tmp-1726882481.4720333-16538-155228320752856="` echo /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856 `" ) && sleep 0' 15627 1726882481.47834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882481.47843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.47853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.47871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.47909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.47916: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882481.47925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.47938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882481.47944: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882481.47951: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882481.47962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.47974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.47985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.47994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.47999: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882481.48007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.48083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882481.48096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882481.48109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882481.48226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882481.50084: stdout chunk (state=3): >>>ansible-tmp-1726882481.4720333-16538-155228320752856=/root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856 <<< 15627 1726882481.50251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882481.50254: stdout chunk (state=3): >>><<< 15627 1726882481.50267: stderr chunk (state=3): >>><<< 15627 1726882481.50282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882481.4720333-16538-155228320752856=/root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882481.50327: variable 'ansible_module_compression' from source: unknown 15627 1726882481.50372: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15627 1726882481.50415: variable 'ansible_facts' from source: unknown 15627 1726882481.50483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/AnsiballZ_service_facts.py 15627 1726882481.50599: Sending initial data 15627 1726882481.50602: Sent initial data (162 bytes) 15627 1726882481.51269: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.51273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.51303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.51309: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.51332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 15627 1726882481.51335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882481.51338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.51395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882481.51398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882481.51495: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882481.53205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882481.53293: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882481.53383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpheo6_hzh /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/AnsiballZ_service_facts.py <<< 15627 1726882481.53477: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882481.54502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882481.54594: stderr chunk (state=3): >>><<< 15627 1726882481.54597: stdout chunk (state=3): >>><<< 15627 1726882481.54614: done transferring module to remote 15627 1726882481.54622: _low_level_execute_command(): starting 15627 1726882481.54627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/ /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/AnsiballZ_service_facts.py && sleep 0' 15627 1726882481.55084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.55113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.55119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.55136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882481.55141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.55223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882481.55339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882481.57065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882481.57138: stderr chunk (state=3): >>><<< 15627 1726882481.57147: stdout chunk (state=3): >>><<< 15627 1726882481.57247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882481.57250: _low_level_execute_command(): starting 15627 1726882481.57253: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/AnsiballZ_service_facts.py && sleep 0' 15627 1726882481.57832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882481.57844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.57862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.57883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.57934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.57946: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882481.57966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.57985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882481.57998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 
15627 1726882481.58019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882481.58033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882481.58047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882481.58070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882481.58083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882481.58095: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882481.58111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882481.58196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882481.58217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882481.58243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882481.58379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882482.89757: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", 
"state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-upda<<< 15627 1726882482.89771: stdout chunk (state=3): >>>te.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": 
"systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "s<<< 15627 1726882482.89773: stdout chunk (state=3): >>>ystemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": 
{"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "syste<<< 15627 1726882482.89776: stdout chunk (state=3): >>>md"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": 
"man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "<<< 15627 1726882482.89778: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15627 1726882482.91054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882482.91062: stdout chunk (state=3): >>><<< 15627 1726882482.91067: stderr chunk (state=3): >>><<< 15627 1726882482.91474: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882482.91776: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882482.91793: _low_level_execute_command(): starting 15627 1726882482.91804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882481.4720333-16538-155228320752856/ > /dev/null 2>&1 && sleep 0' 15627 1726882482.92466: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882482.92481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882482.92494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882482.92511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882482.92552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882482.92570: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882482.92584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882482.92601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882482.92611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 15627 1726882482.92621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882482.92631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882482.92642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882482.92656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882482.92671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882482.92684: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882482.92696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882482.92775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882482.92793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882482.92807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882482.92933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882482.94780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882482.94807: stderr chunk (state=3): >>><<< 15627 1726882482.94810: stdout chunk (state=3): >>><<< 15627 1726882482.94969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882482.94972: handler run complete 15627 1726882482.95175: variable 'ansible_facts' from source: unknown 15627 1726882482.95178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882482.95704: variable 'ansible_facts' from source: unknown 15627 1726882482.95838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882482.96042: attempt loop complete, returning result 15627 1726882482.96051: _execute() done 15627 1726882482.96060: dumping result to json 15627 1726882482.96123: done dumping result, returning 15627 1726882482.96137: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-2847-7723-0000000002fb] 15627 1726882482.96146: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002fb ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882482.96882: no more pending results, returning what we have 15627 1726882482.96885: results queue empty 15627 1726882482.96886: checking for any_errors_fatal 15627 1726882482.96890: done checking for any_errors_fatal 15627 1726882482.96891: checking for max_fail_percentage 15627 1726882482.96893: 
done checking for max_fail_percentage 15627 1726882482.96894: checking to see if all hosts have failed and the running result is not ok 15627 1726882482.96895: done checking to see if all hosts have failed 15627 1726882482.96896: getting the remaining hosts for this loop 15627 1726882482.96897: done getting the remaining hosts for this loop 15627 1726882482.96901: getting the next task for host managed_node1 15627 1726882482.96908: done getting next task for host managed_node1 15627 1726882482.96912: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882482.96915: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882482.96924: getting variables 15627 1726882482.96926: in VariableManager get_vars() 15627 1726882482.96961: Calling all_inventory to load vars for managed_node1 15627 1726882482.96965: Calling groups_inventory to load vars for managed_node1 15627 1726882482.96968: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882482.96978: Calling all_plugins_play to load vars for managed_node1 15627 1726882482.96981: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882482.96984: Calling groups_plugins_play to load vars for managed_node1 15627 1726882482.98384: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002fb 15627 1726882482.98387: WORKER PROCESS EXITING 15627 1726882482.98827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.00582: done with get_vars() 15627 1726882483.00609: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:43 -0400 (0:00:01.587) 0:00:22.758 ****** 15627 1726882483.00710: entering _queue_task() for managed_node1/package_facts 15627 1726882483.01001: worker is 1 (out of 1 available) 15627 1726882483.01012: exiting _queue_task() for managed_node1/package_facts 15627 1726882483.01025: done queuing things up, now waiting for results queue to drain 15627 1726882483.01026: waiting for pending results... 
15627 1726882483.01297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882483.01429: in run() - task 0e448fcc-3ce9-2847-7723-0000000002fc 15627 1726882483.01448: variable 'ansible_search_path' from source: unknown 15627 1726882483.01458: variable 'ansible_search_path' from source: unknown 15627 1726882483.01502: calling self._execute() 15627 1726882483.01601: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.01613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.01626: variable 'omit' from source: magic vars 15627 1726882483.01994: variable 'ansible_distribution_major_version' from source: facts 15627 1726882483.02015: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882483.02025: variable 'omit' from source: magic vars 15627 1726882483.02088: variable 'omit' from source: magic vars 15627 1726882483.02128: variable 'omit' from source: magic vars 15627 1726882483.02174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882483.02212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882483.02240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882483.02266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882483.02283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882483.02313: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882483.02321: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.02328: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882483.02428: Set connection var ansible_timeout to 10 15627 1726882483.02445: Set connection var ansible_shell_executable to /bin/sh 15627 1726882483.02456: Set connection var ansible_connection to ssh 15627 1726882483.02469: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882483.02478: Set connection var ansible_pipelining to False 15627 1726882483.02484: Set connection var ansible_shell_type to sh 15627 1726882483.02508: variable 'ansible_shell_executable' from source: unknown 15627 1726882483.02516: variable 'ansible_connection' from source: unknown 15627 1726882483.02522: variable 'ansible_module_compression' from source: unknown 15627 1726882483.02528: variable 'ansible_shell_type' from source: unknown 15627 1726882483.02534: variable 'ansible_shell_executable' from source: unknown 15627 1726882483.02540: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.02548: variable 'ansible_pipelining' from source: unknown 15627 1726882483.02558: variable 'ansible_timeout' from source: unknown 15627 1726882483.02568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.02760: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882483.02779: variable 'omit' from source: magic vars 15627 1726882483.02788: starting attempt loop 15627 1726882483.02794: running the handler 15627 1726882483.02808: _low_level_execute_command(): starting 15627 1726882483.02819: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882483.03590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882483.03604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15627 1726882483.03618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.03636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.03688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.03699: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.03712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.03730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.03740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882483.03753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.03772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.03788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.03808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.03822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.03837: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882483.03859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.03940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.03970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.03987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882483.04112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882483.05710: stdout chunk (state=3): >>>/root <<< 15627 1726882483.05820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882483.05901: stderr chunk (state=3): >>><<< 15627 1726882483.05917: stdout chunk (state=3): >>><<< 15627 1726882483.05970: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882483.05974: _low_level_execute_command(): starting 15627 1726882483.06056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961 `" && echo ansible-tmp-1726882483.059486-16589-253358749258961="` echo /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961 `" ) && sleep 0' 15627 1726882483.06650: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882483.06670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.06685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.06703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.06762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.06787: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.06813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.06846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.06874: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882483.06899: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.06921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.06943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.06978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.06999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.07015: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882483.07031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.07122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.07144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.07165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882483.07297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882483.09138: stdout chunk (state=3): >>>ansible-tmp-1726882483.059486-16589-253358749258961=/root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961 <<< 15627 1726882483.09370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882483.09374: stdout chunk (state=3): >>><<< 15627 1726882483.09377: stderr chunk (state=3): >>><<< 15627 1726882483.09380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882483.059486-16589-253358749258961=/root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882483.09471: variable 'ansible_module_compression' from source: unknown 15627 1726882483.09475: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15627 1726882483.09570: variable 'ansible_facts' from source: unknown 15627 1726882483.09738: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/AnsiballZ_package_facts.py 15627 1726882483.10393: Sending initial data 15627 1726882483.10396: Sent initial data (161 bytes) 15627 1726882483.12617: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882483.12631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.12644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.12668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.12802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.12815: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.12829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.12845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.12859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882483.12873: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.12884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.12897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.12912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.12924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 15627 1726882483.12935: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882483.12948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.13027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.13189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.13208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882483.13494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882483.15259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882483.15348: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882483.15445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpezihalu8 /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/AnsiballZ_package_facts.py <<< 15627 1726882483.15537: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882483.18485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882483.18671: stderr chunk (state=3): >>><<< 15627 1726882483.18675: stdout chunk (state=3): >>><<< 15627 1726882483.18677: done 
transferring module to remote 15627 1726882483.18686: _low_level_execute_command(): starting 15627 1726882483.18688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/ /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/AnsiballZ_package_facts.py && sleep 0' 15627 1726882483.19271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882483.19289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.19304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.19323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.19369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.19382: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.19396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.19414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.19426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882483.19437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.19449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.19473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.19494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.19507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.19518: stderr chunk (state=3): >>>debug2: 
match found <<< 15627 1726882483.19530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.19602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.19619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.19633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882483.19811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882483.21520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882483.21584: stderr chunk (state=3): >>><<< 15627 1726882483.21587: stdout chunk (state=3): >>><<< 15627 1726882483.21678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882483.21681: 
_low_level_execute_command(): starting 15627 1726882483.21684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/AnsiballZ_package_facts.py && sleep 0' 15627 1726882483.23228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882483.23243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.23259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.23280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.23317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.23328: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.23339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.23356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.23369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882483.23378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.23387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.23398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.23410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.23419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.23428: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882483.23439: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.23516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.23532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.23546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882483.23892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882483.69475: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": 
[{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null,<<< 15627 1726882483.69526: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6",
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", 
"version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", 
"version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],<<< 15627 1726882483.69537: stdout chunk (state=3): >>> "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version":
"1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3",<<< 15627 1726882483.69622: stdout chunk (state=3): >>> "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9",
"epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", 
"release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": 
"dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 15627 1726882483.69632: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 15627 
1726882483.69635: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": 
"0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 15627 1726882483.69641: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", 
"version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": 
[{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 15627 1726882483.69676: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": 
"3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": 
"41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 15627 1726882483.69705: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": 
"libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 15627 1726882483.69708: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15627 1726882483.71247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882483.71251: stdout chunk (state=3): >>><<< 15627 1726882483.71253: stderr chunk (state=3): >>><<< 15627 1726882483.71975: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": 
[{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": 
"coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": 
"libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": 
"4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", 
"release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": 
[{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": 
[{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": 
"python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": 
"8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", 
"version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": 
"python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882483.73881: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882483.73906: _low_level_execute_command(): starting 15627 1726882483.73915: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882483.059486-16589-253358749258961/ > /dev/null 2>&1 && sleep 0' 15627 1726882483.74548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882483.74566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.74581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.74597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.74639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.74650: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882483.74668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.74688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882483.74699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 15627 1726882483.74709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882483.74720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882483.74733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882483.74747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882483.74760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882483.74772: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882483.74784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882483.74853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882483.74886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882483.74901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882483.75028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882483.76879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882483.76967: stderr chunk (state=3): >>><<< 15627 1726882483.76971: stdout chunk (state=3): >>><<< 15627 1726882483.77271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882483.77275: handler run complete 15627 1726882483.78064: variable 'ansible_facts' from source: unknown 15627 1726882483.78593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.80925: variable 'ansible_facts' from source: unknown 15627 1726882483.81392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.82400: attempt loop complete, returning result 15627 1726882483.82422: _execute() done 15627 1726882483.82429: dumping result to json 15627 1726882483.82678: done dumping result, returning 15627 1726882483.82692: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-2847-7723-0000000002fc] 15627 1726882483.82701: sending task result for task 0e448fcc-3ce9-2847-7723-0000000002fc 15627 1726882483.84931: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000002fc 15627 1726882483.84934: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882483.85093: no more pending results, returning what we have 15627 1726882483.85096: results queue empty 15627 1726882483.85097: checking for 
any_errors_fatal 15627 1726882483.85102: done checking for any_errors_fatal 15627 1726882483.85103: checking for max_fail_percentage 15627 1726882483.85104: done checking for max_fail_percentage 15627 1726882483.85105: checking to see if all hosts have failed and the running result is not ok 15627 1726882483.85106: done checking to see if all hosts have failed 15627 1726882483.85107: getting the remaining hosts for this loop 15627 1726882483.85109: done getting the remaining hosts for this loop 15627 1726882483.85112: getting the next task for host managed_node1 15627 1726882483.85120: done getting next task for host managed_node1 15627 1726882483.85123: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882483.85125: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882483.85133: getting variables 15627 1726882483.85135: in VariableManager get_vars() 15627 1726882483.85168: Calling all_inventory to load vars for managed_node1 15627 1726882483.85171: Calling groups_inventory to load vars for managed_node1 15627 1726882483.85173: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882483.85183: Calling all_plugins_play to load vars for managed_node1 15627 1726882483.85185: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882483.85188: Calling groups_plugins_play to load vars for managed_node1 15627 1726882483.86631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.88382: done with get_vars() 15627 1726882483.88405: done getting variables 15627 1726882483.88466: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:43 -0400 (0:00:00.877) 0:00:23.636 ****** 15627 1726882483.88502: entering _queue_task() for managed_node1/debug 15627 1726882483.88763: worker is 1 (out of 1 available) 15627 1726882483.88778: exiting _queue_task() for managed_node1/debug 15627 1726882483.88789: done queuing things up, now waiting for results queue to drain 15627 1726882483.88791: waiting for pending results... 
15627 1726882483.89054: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882483.89160: in run() - task 0e448fcc-3ce9-2847-7723-00000000003b 15627 1726882483.89182: variable 'ansible_search_path' from source: unknown 15627 1726882483.89189: variable 'ansible_search_path' from source: unknown 15627 1726882483.89222: calling self._execute() 15627 1726882483.89317: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.89328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.89349: variable 'omit' from source: magic vars 15627 1726882483.89721: variable 'ansible_distribution_major_version' from source: facts 15627 1726882483.89738: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882483.89750: variable 'omit' from source: magic vars 15627 1726882483.89795: variable 'omit' from source: magic vars 15627 1726882483.89895: variable 'network_provider' from source: set_fact 15627 1726882483.89918: variable 'omit' from source: magic vars 15627 1726882483.89961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882483.90006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882483.90031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882483.90053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882483.90072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882483.90109: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882483.90118: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 
1726882483.90127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.90236: Set connection var ansible_timeout to 10 15627 1726882483.90250: Set connection var ansible_shell_executable to /bin/sh 15627 1726882483.90260: Set connection var ansible_connection to ssh 15627 1726882483.90272: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882483.90282: Set connection var ansible_pipelining to False 15627 1726882483.90289: Set connection var ansible_shell_type to sh 15627 1726882483.90319: variable 'ansible_shell_executable' from source: unknown 15627 1726882483.90328: variable 'ansible_connection' from source: unknown 15627 1726882483.90335: variable 'ansible_module_compression' from source: unknown 15627 1726882483.90342: variable 'ansible_shell_type' from source: unknown 15627 1726882483.90349: variable 'ansible_shell_executable' from source: unknown 15627 1726882483.90355: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.90366: variable 'ansible_pipelining' from source: unknown 15627 1726882483.90374: variable 'ansible_timeout' from source: unknown 15627 1726882483.90382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.90520: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882483.90542: variable 'omit' from source: magic vars 15627 1726882483.90553: starting attempt loop 15627 1726882483.90560: running the handler 15627 1726882483.90609: handler run complete 15627 1726882483.90628: attempt loop complete, returning result 15627 1726882483.90637: _execute() done 15627 1726882483.90647: dumping result to json 15627 1726882483.90654: done dumping result, returning 
15627 1726882483.90672: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-2847-7723-00000000003b] 15627 1726882483.90682: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003b ok: [managed_node1] => {} MSG: Using network provider: nm 15627 1726882483.90828: no more pending results, returning what we have 15627 1726882483.90831: results queue empty 15627 1726882483.90832: checking for any_errors_fatal 15627 1726882483.90840: done checking for any_errors_fatal 15627 1726882483.90841: checking for max_fail_percentage 15627 1726882483.90843: done checking for max_fail_percentage 15627 1726882483.90844: checking to see if all hosts have failed and the running result is not ok 15627 1726882483.90846: done checking to see if all hosts have failed 15627 1726882483.90846: getting the remaining hosts for this loop 15627 1726882483.90848: done getting the remaining hosts for this loop 15627 1726882483.90852: getting the next task for host managed_node1 15627 1726882483.90859: done getting next task for host managed_node1 15627 1726882483.90866: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882483.90868: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882483.90879: getting variables 15627 1726882483.90881: in VariableManager get_vars() 15627 1726882483.90916: Calling all_inventory to load vars for managed_node1 15627 1726882483.90919: Calling groups_inventory to load vars for managed_node1 15627 1726882483.90922: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882483.90932: Calling all_plugins_play to load vars for managed_node1 15627 1726882483.90936: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882483.90939: Calling groups_plugins_play to load vars for managed_node1 15627 1726882483.91984: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003b 15627 1726882483.91987: WORKER PROCESS EXITING 15627 1726882483.92570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.94376: done with get_vars() 15627 1726882483.94396: done getting variables 15627 1726882483.94449: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:43 -0400 (0:00:00.059) 0:00:23.696 ****** 15627 1726882483.94481: entering _queue_task() for managed_node1/fail 15627 1726882483.94716: worker is 1 (out of 1 available) 15627 1726882483.94727: exiting _queue_task() for managed_node1/fail 15627 1726882483.94740: done queuing things up, now waiting for results queue to drain 15627 1726882483.94741: waiting for pending results... 
15627 1726882483.94998: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882483.95108: in run() - task 0e448fcc-3ce9-2847-7723-00000000003c 15627 1726882483.95125: variable 'ansible_search_path' from source: unknown 15627 1726882483.95134: variable 'ansible_search_path' from source: unknown 15627 1726882483.95173: calling self._execute() 15627 1726882483.95267: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882483.95279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882483.95296: variable 'omit' from source: magic vars 15627 1726882483.95660: variable 'ansible_distribution_major_version' from source: facts 15627 1726882483.95677: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882483.95789: variable 'network_state' from source: role '' defaults 15627 1726882483.95803: Evaluated conditional (network_state != {}): False 15627 1726882483.95811: when evaluation is False, skipping this task 15627 1726882483.95819: _execute() done 15627 1726882483.95827: dumping result to json 15627 1726882483.95838: done dumping result, returning 15627 1726882483.95849: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-2847-7723-00000000003c] 15627 1726882483.95860: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882483.96009: no more pending results, returning what we have 15627 1726882483.96013: results queue empty 15627 1726882483.96014: checking for any_errors_fatal 15627 1726882483.96022: done 
checking for any_errors_fatal 15627 1726882483.96023: checking for max_fail_percentage 15627 1726882483.96025: done checking for max_fail_percentage 15627 1726882483.96027: checking to see if all hosts have failed and the running result is not ok 15627 1726882483.96028: done checking to see if all hosts have failed 15627 1726882483.96029: getting the remaining hosts for this loop 15627 1726882483.96031: done getting the remaining hosts for this loop 15627 1726882483.96035: getting the next task for host managed_node1 15627 1726882483.96043: done getting next task for host managed_node1 15627 1726882483.96048: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882483.96050: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882483.96068: getting variables 15627 1726882483.96070: in VariableManager get_vars() 15627 1726882483.96109: Calling all_inventory to load vars for managed_node1 15627 1726882483.96113: Calling groups_inventory to load vars for managed_node1 15627 1726882483.96115: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882483.96128: Calling all_plugins_play to load vars for managed_node1 15627 1726882483.96131: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882483.96134: Calling groups_plugins_play to load vars for managed_node1 15627 1726882483.97383: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003c 15627 1726882483.97387: WORKER PROCESS EXITING 15627 1726882483.97767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882483.99452: done with get_vars() 15627 1726882483.99478: done getting variables 15627 1726882483.99534: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:43 -0400 (0:00:00.050) 0:00:23.747 ****** 15627 1726882483.99567: entering _queue_task() for managed_node1/fail 15627 1726882483.99815: worker is 1 (out of 1 available) 15627 1726882483.99828: exiting _queue_task() for managed_node1/fail 15627 1726882483.99840: done queuing things up, now waiting for results queue to drain 15627 1726882483.99842: waiting for pending results... 
15627 1726882484.00110: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882484.00211: in run() - task 0e448fcc-3ce9-2847-7723-00000000003d 15627 1726882484.00228: variable 'ansible_search_path' from source: unknown 15627 1726882484.00236: variable 'ansible_search_path' from source: unknown 15627 1726882484.00274: calling self._execute() 15627 1726882484.00369: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.00380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.00397: variable 'omit' from source: magic vars 15627 1726882484.00737: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.00752: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.00876: variable 'network_state' from source: role '' defaults 15627 1726882484.00893: Evaluated conditional (network_state != {}): False 15627 1726882484.00902: when evaluation is False, skipping this task 15627 1726882484.00909: _execute() done 15627 1726882484.00916: dumping result to json 15627 1726882484.00922: done dumping result, returning 15627 1726882484.00934: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-2847-7723-00000000003d] 15627 1726882484.00946: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882484.01099: no more pending results, returning what we have 15627 1726882484.01103: results queue empty 15627 1726882484.01104: checking for any_errors_fatal 15627 1726882484.01114: done checking for any_errors_fatal 
15627 1726882484.01115: checking for max_fail_percentage 15627 1726882484.01117: done checking for max_fail_percentage 15627 1726882484.01118: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.01120: done checking to see if all hosts have failed 15627 1726882484.01120: getting the remaining hosts for this loop 15627 1726882484.01123: done getting the remaining hosts for this loop 15627 1726882484.01127: getting the next task for host managed_node1 15627 1726882484.01135: done getting next task for host managed_node1 15627 1726882484.01140: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882484.01142: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.01158: getting variables 15627 1726882484.01160: in VariableManager get_vars() 15627 1726882484.01204: Calling all_inventory to load vars for managed_node1 15627 1726882484.01207: Calling groups_inventory to load vars for managed_node1 15627 1726882484.01210: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.01222: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.01226: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.01229: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.02385: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003d 15627 1726882484.02389: WORKER PROCESS EXITING 15627 1726882484.07649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.09299: done with get_vars() 15627 1726882484.09323: done getting variables 15627 1726882484.09372: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:44 -0400 (0:00:00.098) 0:00:23.845 ****** 15627 1726882484.09399: entering _queue_task() for managed_node1/fail 15627 1726882484.09721: worker is 1 (out of 1 available) 15627 1726882484.09733: exiting _queue_task() for managed_node1/fail 15627 1726882484.09746: done queuing things up, now waiting for results queue to drain 15627 1726882484.09747: waiting for pending results... 
15627 1726882484.10032: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882484.10148: in run() - task 0e448fcc-3ce9-2847-7723-00000000003e 15627 1726882484.10170: variable 'ansible_search_path' from source: unknown 15627 1726882484.10179: variable 'ansible_search_path' from source: unknown 15627 1726882484.10224: calling self._execute() 15627 1726882484.10326: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.10341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.10358: variable 'omit' from source: magic vars 15627 1726882484.10745: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.10762: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.10939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.13311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.13392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.13434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.13475: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.13506: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.13587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.13620: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.13651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.13701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.13720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.13816: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.13835: Evaluated conditional (ansible_distribution_major_version | int > 9): False 15627 1726882484.13842: when evaluation is False, skipping this task 15627 1726882484.13849: _execute() done 15627 1726882484.13856: dumping result to json 15627 1726882484.13866: done dumping result, returning 15627 1726882484.13881: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-2847-7723-00000000003e] 15627 1726882484.13891: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 15627 1726882484.14037: no more pending results, returning what we have 15627 1726882484.14040: results queue empty 15627 1726882484.14041: checking for any_errors_fatal 15627 1726882484.14049: done checking for any_errors_fatal 15627 
1726882484.14050: checking for max_fail_percentage 15627 1726882484.14052: done checking for max_fail_percentage 15627 1726882484.14052: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.14054: done checking to see if all hosts have failed 15627 1726882484.14054: getting the remaining hosts for this loop 15627 1726882484.14056: done getting the remaining hosts for this loop 15627 1726882484.14060: getting the next task for host managed_node1 15627 1726882484.14070: done getting next task for host managed_node1 15627 1726882484.14075: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882484.14076: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.14089: getting variables 15627 1726882484.14091: in VariableManager get_vars() 15627 1726882484.14128: Calling all_inventory to load vars for managed_node1 15627 1726882484.14131: Calling groups_inventory to load vars for managed_node1 15627 1726882484.14133: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.14142: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.14145: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.14148: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.15283: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003e 15627 1726882484.15286: WORKER PROCESS EXITING 15627 1726882484.15866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.16836: done with get_vars() 15627 1726882484.16852: done getting variables 15627 1726882484.16895: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:44 -0400 (0:00:00.075) 0:00:23.920 ****** 15627 1726882484.16917: entering _queue_task() for managed_node1/dnf 15627 1726882484.17130: worker is 1 (out of 1 available) 15627 1726882484.17144: exiting _queue_task() for managed_node1/dnf 15627 1726882484.17157: done queuing things up, now waiting for results queue to drain 15627 1726882484.17158: waiting for pending results... 
15627 1726882484.17339: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882484.17411: in run() - task 0e448fcc-3ce9-2847-7723-00000000003f 15627 1726882484.17422: variable 'ansible_search_path' from source: unknown 15627 1726882484.17426: variable 'ansible_search_path' from source: unknown 15627 1726882484.17457: calling self._execute() 15627 1726882484.17533: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.17537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.17547: variable 'omit' from source: magic vars 15627 1726882484.18174: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.18190: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.18398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.22088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.22766: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.22808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.22847: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.22884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.22966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.23003: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.23034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.23087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.23109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.23232: variable 'ansible_distribution' from source: facts 15627 1726882484.23276: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.23296: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15627 1726882484.23445: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882484.23597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.23626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.23659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.23709: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.23730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.23780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.23809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.23839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.23888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.23906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.23945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.23980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 
1726882484.24011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.24058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.24080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.24236: variable 'network_connections' from source: play vars 15627 1726882484.24256: variable 'profile' from source: play vars 15627 1726882484.24325: variable 'profile' from source: play vars 15627 1726882484.24335: variable 'interface' from source: set_fact 15627 1726882484.24404: variable 'interface' from source: set_fact 15627 1726882484.24481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882484.24648: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882484.24818: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882484.24852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882484.25000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882484.25046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882484.25284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882484.25321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.25352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882484.25406: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882484.25905: variable 'network_connections' from source: play vars 15627 1726882484.25971: variable 'profile' from source: play vars 15627 1726882484.26236: variable 'profile' from source: play vars 15627 1726882484.26246: variable 'interface' from source: set_fact 15627 1726882484.26386: variable 'interface' from source: set_fact 15627 1726882484.26428: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882484.26459: when evaluation is False, skipping this task 15627 1726882484.26471: _execute() done 15627 1726882484.26478: dumping result to json 15627 1726882484.26485: done dumping result, returning 15627 1726882484.26500: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-00000000003f] 15627 1726882484.26509: sending task result for task 0e448fcc-3ce9-2847-7723-00000000003f skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882484.26675: no more pending results, returning what we have 15627 1726882484.26678: results queue 
empty 15627 1726882484.26679: checking for any_errors_fatal 15627 1726882484.26685: done checking for any_errors_fatal 15627 1726882484.26686: checking for max_fail_percentage 15627 1726882484.26688: done checking for max_fail_percentage 15627 1726882484.26688: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.26689: done checking to see if all hosts have failed 15627 1726882484.26690: getting the remaining hosts for this loop 15627 1726882484.26692: done getting the remaining hosts for this loop 15627 1726882484.26695: getting the next task for host managed_node1 15627 1726882484.26701: done getting next task for host managed_node1 15627 1726882484.26705: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882484.26707: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.26723: getting variables 15627 1726882484.26725: in VariableManager get_vars() 15627 1726882484.26761: Calling all_inventory to load vars for managed_node1 15627 1726882484.26766: Calling groups_inventory to load vars for managed_node1 15627 1726882484.26768: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.26779: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.26781: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.26785: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.27312: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000003f 15627 1726882484.27315: WORKER PROCESS EXITING 15627 1726882484.28455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.32537: done with get_vars() 15627 1726882484.32561: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15627 1726882484.32675: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:44 -0400 (0:00:00.157) 0:00:24.078 ****** 15627 1726882484.32708: entering _queue_task() for managed_node1/yum 15627 1726882484.33007: worker is 1 (out of 1 available) 15627 1726882484.33019: exiting _queue_task() for managed_node1/yum 15627 1726882484.33034: done queuing things up, now 
waiting for results queue to drain 15627 1726882484.33036: waiting for pending results... 15627 1726882484.33321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882484.33442: in run() - task 0e448fcc-3ce9-2847-7723-000000000040 15627 1726882484.33462: variable 'ansible_search_path' from source: unknown 15627 1726882484.33476: variable 'ansible_search_path' from source: unknown 15627 1726882484.33517: calling self._execute() 15627 1726882484.33628: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.33641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.33657: variable 'omit' from source: magic vars 15627 1726882484.34047: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.34066: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.34267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.36669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.36746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.36790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.36858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.36893: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.36973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.37009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.37040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.37094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.37114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.37215: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.37233: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15627 1726882484.37241: when evaluation is False, skipping this task 15627 1726882484.37248: _execute() done 15627 1726882484.37254: dumping result to json 15627 1726882484.37261: done dumping result, returning 15627 1726882484.37274: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000040] 15627 1726882484.37284: sending task result for task 0e448fcc-3ce9-2847-7723-000000000040 15627 1726882484.37391: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000040 15627 1726882484.37399: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15627 1726882484.37451: no more pending results, returning what we have 15627 1726882484.37454: results queue empty 15627 1726882484.37455: checking for any_errors_fatal 15627 1726882484.37462: done checking for any_errors_fatal 15627 1726882484.37465: checking for max_fail_percentage 15627 1726882484.37467: done checking for max_fail_percentage 15627 1726882484.37468: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.37469: done checking to see if all hosts have failed 15627 1726882484.37470: getting the remaining hosts for this loop 15627 1726882484.37472: done getting the remaining hosts for this loop 15627 1726882484.37475: getting the next task for host managed_node1 15627 1726882484.37483: done getting next task for host managed_node1 15627 1726882484.37487: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882484.37489: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.37502: getting variables 15627 1726882484.37503: in VariableManager get_vars() 15627 1726882484.37541: Calling all_inventory to load vars for managed_node1 15627 1726882484.37545: Calling groups_inventory to load vars for managed_node1 15627 1726882484.37548: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.37557: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.37561: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.37566: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.39395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.41924: done with get_vars() 15627 1726882484.41945: done getting variables 15627 1726882484.42002: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:44 -0400 (0:00:00.093) 0:00:24.171 ****** 15627 1726882484.42033: entering _queue_task() for managed_node1/fail 15627 1726882484.42852: worker is 1 (out of 1 available) 15627 1726882484.42917: exiting _queue_task() for managed_node1/fail 15627 1726882484.42955: done queuing things up, now waiting for results queue to drain 15627 1726882484.42957: waiting for pending results... 
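The two package-manager checks above are mutually exclusive by construction: the DNF variant is guarded by `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, the YUM variant by `ansible_distribution_major_version | int < 8`, so at most one can run per host. The `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line also shows that on this ansible-core 2.17 controller the `yum` action resolves to the `dnf` action plugin. A hedged sketch of such a guarded pair (module arguments and task names are assumptions; only the `when` expressions come from the log):

```yaml
# Sketch of mutually exclusive package-manager checks. The `when`
# expressions match the "Evaluated conditional" lines in the log;
# module arguments are assumed for illustration.
- name: Check for network package updates via DNF
  ansible.builtin.dnf:
    list: updates  # assumed argument
  when: ansible_distribution == 'Fedora' or
        ansible_distribution_major_version | int > 7

- name: Check for network package updates via YUM
  ansible.builtin.yum:  # resolved to the dnf action plugin on core 2.17
    list: updates  # assumed argument
  when: ansible_distribution_major_version | int < 8
```

On this EL8/EL9 host only the DNF variant's guard holds, and it was still skipped because the wireless/team precondition evaluated False.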
15627 1726882484.43726: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882484.44176: in run() - task 0e448fcc-3ce9-2847-7723-000000000041 15627 1726882484.44280: variable 'ansible_search_path' from source: unknown 15627 1726882484.44357: variable 'ansible_search_path' from source: unknown 15627 1726882484.44431: calling self._execute() 15627 1726882484.44529: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.44540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.44559: variable 'omit' from source: magic vars 15627 1726882484.45119: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.45136: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.45283: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882484.45585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.48333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.48441: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.48517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.48581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.48616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.48760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882484.48804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.48856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.48917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.48956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.49078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.49122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.49153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.49203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.49231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.49287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.49328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.49428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.49534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.49554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.49844: variable 'network_connections' from source: play vars 15627 1726882484.49875: variable 'profile' from source: play vars 15627 1726882484.50017: variable 'profile' from source: play vars 15627 1726882484.50026: variable 'interface' from source: set_fact 15627 1726882484.50105: variable 'interface' from source: set_fact 15627 1726882484.50196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882484.50391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882484.50441: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882484.50478: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882484.50517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882484.50563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882484.50592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882484.50634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.50674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882484.50730: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882484.50984: variable 'network_connections' from source: play vars 15627 1726882484.50994: variable 'profile' from source: play vars 15627 1726882484.51062: variable 'profile' from source: play vars 15627 1726882484.51124: variable 'interface' from source: set_fact 15627 1726882484.51226: variable 'interface' from source: set_fact 15627 1726882484.51300: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882484.51322: when evaluation is False, skipping this task 15627 1726882484.51339: _execute() done 15627 1726882484.51360: dumping result to json 15627 1726882484.51390: done dumping result, returning 15627 1726882484.51420: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000041] 15627 1726882484.51438: sending task result for task 0e448fcc-3ce9-2847-7723-000000000041 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882484.51602: no more pending results, returning what we have 15627 1726882484.51606: results queue empty 15627 1726882484.51606: checking for any_errors_fatal 15627 1726882484.51626: done checking for any_errors_fatal 15627 1726882484.51628: checking for max_fail_percentage 15627 1726882484.51631: done checking for max_fail_percentage 15627 1726882484.51654: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.51658: done checking to see if all hosts have failed 15627 1726882484.51659: getting the remaining hosts for this loop 15627 1726882484.51661: done getting the remaining hosts for this loop 15627 1726882484.51667: getting the next task for host managed_node1 15627 1726882484.51676: done getting next task for host managed_node1 15627 1726882484.51685: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15627 1726882484.51687: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.51708: getting variables 15627 1726882484.51711: in VariableManager get_vars() 15627 1726882484.51753: Calling all_inventory to load vars for managed_node1 15627 1726882484.51758: Calling groups_inventory to load vars for managed_node1 15627 1726882484.51761: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.51775: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.51778: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.51782: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.53001: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000041 15627 1726882484.53005: WORKER PROCESS EXITING 15627 1726882484.55085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.58201: done with get_vars() 15627 1726882484.58251: done getting variables 15627 1726882484.58396: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:44 -0400 (0:00:00.164) 0:00:24.336 ****** 15627 1726882484.58447: entering _queue_task() for managed_node1/package 15627 1726882484.58922: worker is 1 (out of 1 available) 15627 1726882484.58941: exiting _queue_task() for managed_node1/package 15627 1726882484.58961: done queuing things up, now waiting for results queue to drain 15627 1726882484.58962: waiting for pending results... 
15627 1726882484.59375: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15627 1726882484.59513: in run() - task 0e448fcc-3ce9-2847-7723-000000000042 15627 1726882484.59533: variable 'ansible_search_path' from source: unknown 15627 1726882484.59541: variable 'ansible_search_path' from source: unknown 15627 1726882484.59586: calling self._execute() 15627 1726882484.59769: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.59786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.59801: variable 'omit' from source: magic vars 15627 1726882484.60527: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.60545: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.60897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882484.61304: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882484.61360: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882484.61417: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882484.61935: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882484.62067: variable 'network_packages' from source: role '' defaults 15627 1726882484.62192: variable '__network_provider_setup' from source: role '' defaults 15627 1726882484.62207: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882484.62280: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882484.62298: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882484.62369: variable 
'__network_packages_default_nm' from source: role '' defaults 15627 1726882484.62642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.66490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.66585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.66630: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.66693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.66729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.66812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.66862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.66898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.66948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.66970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 
1726882484.67020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.67053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.67090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.67135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.67161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.67404: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882484.67525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.67554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.67590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.67641: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.67661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.67765: variable 'ansible_python' from source: facts 15627 1726882484.67795: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882484.67889: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882484.67981: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882484.68117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.68151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.68187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.68230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.68255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.68311: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.68351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.68386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.68431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.68451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.68616: variable 'network_connections' from source: play vars 15627 1726882484.68626: variable 'profile' from source: play vars 15627 1726882484.68737: variable 'profile' from source: play vars 15627 1726882484.68749: variable 'interface' from source: set_fact 15627 1726882484.68918: variable 'interface' from source: set_fact 15627 1726882484.68994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882484.69031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882484.69073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.69108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882484.69166: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882484.69699: variable 'network_connections' from source: play vars 15627 1726882484.69713: variable 'profile' from source: play vars 15627 1726882484.70497: variable 'profile' from source: play vars 15627 1726882484.70514: variable 'interface' from source: set_fact 15627 1726882484.70737: variable 'interface' from source: set_fact 15627 1726882484.70775: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882484.70867: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882484.72491: variable 'network_connections' from source: play vars 15627 1726882484.72501: variable 'profile' from source: play vars 15627 1726882484.72603: variable 'profile' from source: play vars 15627 1726882484.72627: variable 'interface' from source: set_fact 15627 1726882484.72738: variable 'interface' from source: set_fact 15627 1726882484.72854: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882484.73053: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882484.73730: variable 'network_connections' from source: play vars 15627 1726882484.73742: variable 'profile' from source: play vars 15627 1726882484.73852: variable 'profile' from source: play vars 15627 1726882484.73897: variable 'interface' from source: set_fact 15627 1726882484.74228: variable 'interface' from source: set_fact 15627 1726882484.74446: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882484.74588: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882484.74619: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882484.74709: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882484.74955: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882484.75486: variable 'network_connections' from source: play vars 15627 1726882484.75500: variable 'profile' from source: play vars 15627 1726882484.75562: variable 'profile' from source: play vars 15627 1726882484.75574: variable 'interface' from source: set_fact 15627 1726882484.75644: variable 'interface' from source: set_fact 15627 1726882484.75657: variable 'ansible_distribution' from source: facts 15627 1726882484.75671: variable '__network_rh_distros' from source: role '' defaults 15627 1726882484.75681: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.75698: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882484.75877: variable 'ansible_distribution' from source: facts 15627 1726882484.75891: variable '__network_rh_distros' from source: role '' defaults 15627 1726882484.75902: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.75918: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882484.76093: variable 'ansible_distribution' from source: facts 15627 1726882484.76108: variable '__network_rh_distros' from source: role '' defaults 15627 1726882484.76118: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.76162: variable 'network_provider' from source: set_fact 15627 1726882484.76185: variable 'ansible_facts' from source: unknown 15627 1726882484.77151: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15627 
1726882484.77159: when evaluation is False, skipping this task 15627 1726882484.77171: _execute() done 15627 1726882484.77176: dumping result to json 15627 1726882484.77182: done dumping result, returning 15627 1726882484.77193: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-2847-7723-000000000042] 15627 1726882484.77202: sending task result for task 0e448fcc-3ce9-2847-7723-000000000042 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15627 1726882484.77343: no more pending results, returning what we have 15627 1726882484.77346: results queue empty 15627 1726882484.77347: checking for any_errors_fatal 15627 1726882484.77355: done checking for any_errors_fatal 15627 1726882484.77356: checking for max_fail_percentage 15627 1726882484.77358: done checking for max_fail_percentage 15627 1726882484.77358: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.77359: done checking to see if all hosts have failed 15627 1726882484.77360: getting the remaining hosts for this loop 15627 1726882484.77362: done getting the remaining hosts for this loop 15627 1726882484.77368: getting the next task for host managed_node1 15627 1726882484.77375: done getting next task for host managed_node1 15627 1726882484.77379: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882484.77381: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.77394: getting variables 15627 1726882484.77396: in VariableManager get_vars() 15627 1726882484.77433: Calling all_inventory to load vars for managed_node1 15627 1726882484.77435: Calling groups_inventory to load vars for managed_node1 15627 1726882484.77438: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.77447: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.77455: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.77459: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.78482: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000042 15627 1726882484.78486: WORKER PROCESS EXITING 15627 1726882484.79295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.81065: done with get_vars() 15627 1726882484.81087: done getting variables 15627 1726882484.81145: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:44 -0400 (0:00:00.227) 0:00:24.563 ****** 15627 1726882484.81177: entering _queue_task() for managed_node1/package 15627 1726882484.81435: worker is 1 (out of 1 available) 15627 1726882484.81449: exiting _queue_task() for managed_node1/package 15627 1726882484.81462: done queuing things up, now waiting for results queue to drain 15627 1726882484.81466: waiting for pending results... 
15627 1726882484.81747: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882484.81861: in run() - task 0e448fcc-3ce9-2847-7723-000000000043 15627 1726882484.81884: variable 'ansible_search_path' from source: unknown 15627 1726882484.81893: variable 'ansible_search_path' from source: unknown 15627 1726882484.81934: calling self._execute() 15627 1726882484.82040: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.82055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.82072: variable 'omit' from source: magic vars 15627 1726882484.82457: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.82476: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.82604: variable 'network_state' from source: role '' defaults 15627 1726882484.82620: Evaluated conditional (network_state != {}): False 15627 1726882484.82627: when evaluation is False, skipping this task 15627 1726882484.82633: _execute() done 15627 1726882484.82640: dumping result to json 15627 1726882484.82647: done dumping result, returning 15627 1726882484.82657: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-2847-7723-000000000043] 15627 1726882484.82674: sending task result for task 0e448fcc-3ce9-2847-7723-000000000043 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882484.82822: no more pending results, returning what we have 15627 1726882484.82826: results queue empty 15627 1726882484.82826: checking for any_errors_fatal 15627 1726882484.82833: done checking for any_errors_fatal 15627 1726882484.82834: checking for max_fail_percentage 15627 
1726882484.82836: done checking for max_fail_percentage 15627 1726882484.82836: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.82838: done checking to see if all hosts have failed 15627 1726882484.82838: getting the remaining hosts for this loop 15627 1726882484.82840: done getting the remaining hosts for this loop 15627 1726882484.82844: getting the next task for host managed_node1 15627 1726882484.82852: done getting next task for host managed_node1 15627 1726882484.82856: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882484.82858: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882484.82874: getting variables 15627 1726882484.82876: in VariableManager get_vars() 15627 1726882484.82913: Calling all_inventory to load vars for managed_node1 15627 1726882484.82917: Calling groups_inventory to load vars for managed_node1 15627 1726882484.82919: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.82931: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.82935: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.82938: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.83982: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000043 15627 1726882484.83986: WORKER PROCESS EXITING 15627 1726882484.84654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.86295: done with get_vars() 15627 1726882484.86317: done getting variables 15627 1726882484.86375: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:44 -0400 (0:00:00.052) 0:00:24.615 ****** 15627 1726882484.86408: entering _queue_task() for managed_node1/package 15627 1726882484.86695: worker is 1 (out of 1 available) 15627 1726882484.86708: exiting _queue_task() for managed_node1/package 15627 1726882484.86720: done queuing things up, now waiting for results queue to drain 15627 1726882484.86721: waiting for pending results... 15627 1726882484.86990: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882484.87095: in run() - task 0e448fcc-3ce9-2847-7723-000000000044 15627 1726882484.87114: variable 'ansible_search_path' from source: unknown 15627 1726882484.87122: variable 'ansible_search_path' from source: unknown 15627 1726882484.87162: calling self._execute() 15627 1726882484.87255: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.87270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.87290: variable 'omit' from source: magic vars 15627 1726882484.87640: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.87657: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.87778: variable 'network_state' from source: role '' defaults 15627 1726882484.87795: Evaluated conditional (network_state != {}): False 15627 1726882484.87803: when evaluation is False, 
skipping this task 15627 1726882484.87810: _execute() done 15627 1726882484.87819: dumping result to json 15627 1726882484.87830: done dumping result, returning 15627 1726882484.87841: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-2847-7723-000000000044] 15627 1726882484.87852: sending task result for task 0e448fcc-3ce9-2847-7723-000000000044 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882484.87994: no more pending results, returning what we have 15627 1726882484.87998: results queue empty 15627 1726882484.87998: checking for any_errors_fatal 15627 1726882484.88007: done checking for any_errors_fatal 15627 1726882484.88008: checking for max_fail_percentage 15627 1726882484.88010: done checking for max_fail_percentage 15627 1726882484.88011: checking to see if all hosts have failed and the running result is not ok 15627 1726882484.88012: done checking to see if all hosts have failed 15627 1726882484.88012: getting the remaining hosts for this loop 15627 1726882484.88015: done getting the remaining hosts for this loop 15627 1726882484.88018: getting the next task for host managed_node1 15627 1726882484.88026: done getting next task for host managed_node1 15627 1726882484.88029: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882484.88031: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882484.88045: getting variables 15627 1726882484.88046: in VariableManager get_vars() 15627 1726882484.88085: Calling all_inventory to load vars for managed_node1 15627 1726882484.88088: Calling groups_inventory to load vars for managed_node1 15627 1726882484.88090: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882484.88102: Calling all_plugins_play to load vars for managed_node1 15627 1726882484.88106: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882484.88109: Calling groups_plugins_play to load vars for managed_node1 15627 1726882484.89082: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000044 15627 1726882484.89086: WORKER PROCESS EXITING 15627 1726882484.89791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882484.91572: done with get_vars() 15627 1726882484.91598: done getting variables 15627 1726882484.91665: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:44 -0400 (0:00:00.052) 0:00:24.668 ****** 15627 1726882484.91697: entering _queue_task() for managed_node1/service 15627 1726882484.91990: worker is 1 (out of 1 available) 15627 1726882484.92001: exiting _queue_task() for managed_node1/service 15627 1726882484.92013: done queuing things up, now waiting for results queue to drain 15627 1726882484.92015: waiting for pending results... 
15627 1726882484.92292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882484.92415: in run() - task 0e448fcc-3ce9-2847-7723-000000000045 15627 1726882484.92436: variable 'ansible_search_path' from source: unknown 15627 1726882484.92446: variable 'ansible_search_path' from source: unknown 15627 1726882484.92494: calling self._execute() 15627 1726882484.92596: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882484.92610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882484.92625: variable 'omit' from source: magic vars 15627 1726882484.93004: variable 'ansible_distribution_major_version' from source: facts 15627 1726882484.93024: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882484.93149: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882484.93353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882484.95977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882484.96057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882484.96100: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882484.96191: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882484.96293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882484.96484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15627 1726882484.96518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.96547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.96708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.96728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.96776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.96918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.96947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.96993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.97011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.97056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882484.97157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882484.97189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.97282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882484.97368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882484.97656: variable 'network_connections' from source: play vars 15627 1726882484.97793: variable 'profile' from source: play vars 15627 1726882484.97866: variable 'profile' from source: play vars 15627 1726882484.97895: variable 'interface' from source: set_fact 15627 1726882484.97960: variable 'interface' from source: set_fact 15627 1726882484.98176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882484.98570: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882484.98609: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882484.98769: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882484.98804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882484.98848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882484.98880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882484.98998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882484.99028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882484.99193: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882484.99674: variable 'network_connections' from source: play vars 15627 1726882484.99684: variable 'profile' from source: play vars 15627 1726882484.99858: variable 'profile' from source: play vars 15627 1726882484.99871: variable 'interface' from source: set_fact 15627 1726882484.99927: variable 'interface' from source: set_fact 15627 1726882484.99957: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882485.00065: when evaluation is False, skipping this task 15627 1726882485.00073: _execute() done 15627 1726882485.00078: dumping result to json 15627 1726882485.00084: done dumping result, returning 15627 1726882485.00093: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000045] 15627 1726882485.00108: sending task result for task 0e448fcc-3ce9-2847-7723-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882485.00257: no more pending results, returning what we have 15627 1726882485.00261: results queue empty 15627 1726882485.00262: checking for any_errors_fatal 15627 1726882485.00270: done checking for any_errors_fatal 15627 1726882485.00271: checking for max_fail_percentage 15627 1726882485.00273: done checking for max_fail_percentage 15627 1726882485.00274: checking to see if all hosts have failed and the running result is not ok 15627 1726882485.00275: done checking to see if all hosts have failed 15627 1726882485.00276: getting the remaining hosts for this loop 15627 1726882485.00277: done getting the remaining hosts for this loop 15627 1726882485.00281: getting the next task for host managed_node1 15627 1726882485.00289: done getting next task for host managed_node1 15627 1726882485.00293: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882485.00295: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882485.00309: getting variables 15627 1726882485.00311: in VariableManager get_vars() 15627 1726882485.00349: Calling all_inventory to load vars for managed_node1 15627 1726882485.00352: Calling groups_inventory to load vars for managed_node1 15627 1726882485.00356: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882485.00367: Calling all_plugins_play to load vars for managed_node1 15627 1726882485.00371: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882485.00374: Calling groups_plugins_play to load vars for managed_node1 15627 1726882485.01420: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000045 15627 1726882485.01423: WORKER PROCESS EXITING 15627 1726882485.03511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882485.06925: done with get_vars() 15627 1726882485.06952: done getting variables 15627 1726882485.07013: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:45 -0400 (0:00:00.153) 0:00:24.822 ****** 15627 1726882485.07043: entering _queue_task() for managed_node1/service 15627 1726882485.08061: worker is 1 (out of 1 available) 15627 1726882485.08076: exiting _queue_task() for managed_node1/service 15627 1726882485.08087: done queuing things up, now waiting for results queue to drain 15627 1726882485.08089: waiting for pending results... 
15627 1726882485.08579: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882485.08801: in run() - task 0e448fcc-3ce9-2847-7723-000000000046 15627 1726882485.08819: variable 'ansible_search_path' from source: unknown 15627 1726882485.08859: variable 'ansible_search_path' from source: unknown 15627 1726882485.08900: calling self._execute() 15627 1726882485.09053: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882485.09188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882485.09202: variable 'omit' from source: magic vars 15627 1726882485.09910: variable 'ansible_distribution_major_version' from source: facts 15627 1726882485.09953: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882485.10287: variable 'network_provider' from source: set_fact 15627 1726882485.10371: variable 'network_state' from source: role '' defaults 15627 1726882485.10388: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15627 1726882485.10406: variable 'omit' from source: magic vars 15627 1726882485.10454: variable 'omit' from source: magic vars 15627 1726882485.10616: variable 'network_service_name' from source: role '' defaults 15627 1726882485.10685: variable 'network_service_name' from source: role '' defaults 15627 1726882485.10797: variable '__network_provider_setup' from source: role '' defaults 15627 1726882485.10921: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882485.10988: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882485.11140: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882485.11205: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882485.11658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15627 1726882485.16181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882485.16260: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882485.16374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882485.16481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882485.16511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882485.16626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882485.16801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882485.16831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.16992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882485.17012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882485.17056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882485.17088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882485.17119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.17246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882485.17267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882485.17740: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882485.18080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882485.18109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882485.18139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.18298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882485.18317: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882485.18523: variable 'ansible_python' from source: facts 15627 1726882485.18550: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882485.18750: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882485.18839: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882485.19180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882485.19209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882485.19237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.19285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882485.19388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882485.19438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882485.19506: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882485.19616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.19658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882485.19681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882485.20059: variable 'network_connections' from source: play vars 15627 1726882485.20074: variable 'profile' from source: play vars 15627 1726882485.20262: variable 'profile' from source: play vars 15627 1726882485.20277: variable 'interface' from source: set_fact 15627 1726882485.20339: variable 'interface' from source: set_fact 15627 1726882485.20568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882485.21354: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882485.21407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882485.21458: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882485.21598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882485.21777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882485.21812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882485.21849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882485.22002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882485.22052: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882485.22783: variable 'network_connections' from source: play vars 15627 1726882485.22795: variable 'profile' from source: play vars 15627 1726882485.22985: variable 'profile' from source: play vars 15627 1726882485.22996: variable 'interface' from source: set_fact 15627 1726882485.23057: variable 'interface' from source: set_fact 15627 1726882485.23430: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882485.23584: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882485.24560: variable 'network_connections' from source: play vars 15627 1726882485.24572: variable 'profile' from source: play vars 15627 1726882485.24656: variable 'profile' from source: play vars 15627 1726882485.24923: variable 'interface' from source: set_fact 15627 1726882485.24995: variable 'interface' from source: set_fact 15627 1726882485.25271: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882485.25356: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882485.26313: variable 
'network_connections' from source: play vars 15627 1726882485.26324: variable 'profile' from source: play vars 15627 1726882485.26510: variable 'profile' from source: play vars 15627 1726882485.26670: variable 'interface' from source: set_fact 15627 1726882485.26746: variable 'interface' from source: set_fact 15627 1726882485.27037: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882485.27324: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882485.27335: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882485.27398: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882485.28189: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882485.29723: variable 'network_connections' from source: play vars 15627 1726882485.30679: variable 'profile' from source: play vars 15627 1726882485.30737: variable 'profile' from source: play vars 15627 1726882485.30740: variable 'interface' from source: set_fact 15627 1726882485.30811: variable 'interface' from source: set_fact 15627 1726882485.30819: variable 'ansible_distribution' from source: facts 15627 1726882485.30822: variable '__network_rh_distros' from source: role '' defaults 15627 1726882485.30828: variable 'ansible_distribution_major_version' from source: facts 15627 1726882485.30842: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882485.31221: variable 'ansible_distribution' from source: facts 15627 1726882485.31226: variable '__network_rh_distros' from source: role '' defaults 15627 1726882485.31228: variable 'ansible_distribution_major_version' from source: facts 15627 1726882485.31243: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882485.31918: variable 'ansible_distribution' from source: 
facts 15627 1726882485.31922: variable '__network_rh_distros' from source: role '' defaults 15627 1726882485.31927: variable 'ansible_distribution_major_version' from source: facts 15627 1726882485.32172: variable 'network_provider' from source: set_fact 15627 1726882485.32197: variable 'omit' from source: magic vars 15627 1726882485.32225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882485.32252: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882485.32276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882485.32294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882485.32304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882485.32333: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882485.32337: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882485.32339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882485.32846: Set connection var ansible_timeout to 10 15627 1726882485.32854: Set connection var ansible_shell_executable to /bin/sh 15627 1726882485.32864: Set connection var ansible_connection to ssh 15627 1726882485.32870: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882485.32876: Set connection var ansible_pipelining to False 15627 1726882485.32878: Set connection var ansible_shell_type to sh 15627 1726882485.32905: variable 'ansible_shell_executable' from source: unknown 15627 1726882485.32909: variable 'ansible_connection' from source: unknown 15627 1726882485.32912: variable 'ansible_module_compression' from source: unknown 15627 1726882485.32914: 
variable 'ansible_shell_type' from source: unknown 15627 1726882485.32916: variable 'ansible_shell_executable' from source: unknown 15627 1726882485.32918: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882485.32925: variable 'ansible_pipelining' from source: unknown 15627 1726882485.32927: variable 'ansible_timeout' from source: unknown 15627 1726882485.32929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882485.33436: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882485.33446: variable 'omit' from source: magic vars 15627 1726882485.33451: starting attempt loop 15627 1726882485.33454: running the handler 15627 1726882485.33534: variable 'ansible_facts' from source: unknown 15627 1726882485.36381: _low_level_execute_command(): starting 15627 1726882485.36387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882485.38320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.38326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.38475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882485.38481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882485.38556: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.38568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882485.38571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.38645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882485.38775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.38778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.38906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.40592: stdout chunk (state=3): >>>/root <<< 15627 1726882485.40755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882485.40765: stderr chunk (state=3): >>><<< 15627 1726882485.40770: stdout chunk (state=3): >>><<< 15627 1726882485.40792: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882485.40804: _low_level_execute_command(): starting 15627 1726882485.40810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253 `" && echo ansible-tmp-1726882485.407916-16684-192515301845253="` echo /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253 `" ) && sleep 0' 15627 1726882485.43056: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882485.43070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.43081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.43095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.43133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.43140: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882485.43150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.43172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882485.43274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882485.43281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882485.43290: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15627 1726882485.43299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.43311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.43319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.43325: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882485.43334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.43413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882485.43496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.43506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.43711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.45700: stdout chunk (state=3): >>>ansible-tmp-1726882485.407916-16684-192515301845253=/root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253 <<< 15627 1726882485.45869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882485.45873: stdout chunk (state=3): >>><<< 15627 1726882485.45881: stderr chunk (state=3): >>><<< 15627 1726882485.45896: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882485.407916-16684-192515301845253=/root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882485.45929: variable 'ansible_module_compression' from source: unknown 15627 1726882485.45985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15627 1726882485.46041: variable 'ansible_facts' from source: unknown 15627 1726882485.46199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/AnsiballZ_systemd.py 15627 1726882485.46741: Sending initial data 15627 1726882485.46744: Sent initial data (155 bytes) 15627 1726882485.50016: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882485.50025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.50036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.50051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.50094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.50101: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882485.50111: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.50124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882485.50131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882485.50138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882485.50146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.50155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.50173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.50179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.50186: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882485.50195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.50272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882485.50783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.50794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.51188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.53036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882485.53134: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882485.53226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpipcbemkw /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/AnsiballZ_systemd.py <<< 15627 1726882485.53332: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882485.56340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882485.56494: stderr chunk (state=3): >>><<< 15627 1726882485.56498: stdout chunk (state=3): >>><<< 15627 1726882485.56599: done transferring module to remote 15627 1726882485.56602: _low_level_execute_command(): starting 15627 1726882485.56605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/ /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/AnsiballZ_systemd.py && sleep 0' 15627 1726882485.57587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882485.58073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.58089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.58107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.58147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.58578: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882485.58592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15627 1726882485.58610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882485.58621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882485.58630: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882485.58641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.58653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.58673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.58684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.58694: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882485.58709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.58785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882485.58805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.58820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.58946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.60857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882485.60861: stdout chunk (state=3): >>><<< 15627 1726882485.60865: stderr chunk (state=3): >>><<< 15627 1726882485.60946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882485.60953: _low_level_execute_command(): starting 15627 1726882485.60955: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/AnsiballZ_systemd.py && sleep 0' 15627 1726882485.62679: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.62720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.62760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.62869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.88012: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 15627 1726882485.88062: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16101376", "MemoryAvailable": "infinity", "CPUUsageNSec": "799199000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", 
"LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", 
"SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 15627 1726882485.88069: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", 
"ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15627 1726882485.89544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882485.89631: stderr chunk (state=3): >>><<< 15627 1726882485.89635: stdout chunk (state=3): >>><<< 15627 1726882485.89670: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16101376", "MemoryAvailable": "infinity", "CPUUsageNSec": "799199000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882485.89900: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882485.89903: _low_level_execute_command(): starting 15627 1726882485.89905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882485.407916-16684-192515301845253/ > /dev/null 2>&1 && sleep 0' 15627 1726882485.91513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882485.91526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.91539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.91555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.91597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.91609: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882485.91621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.91637: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882485.91648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882485.91657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882485.91674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882485.91688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882485.91704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882485.91780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882485.91791: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882485.91809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882485.91882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882485.91988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882485.92004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882485.92125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882485.93981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882485.94037: stderr chunk (state=3): >>><<< 15627 1726882485.94040: stdout chunk (state=3): >>><<< 15627 1726882485.94172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882485.94175: handler run complete 15627 1726882485.94178: attempt loop complete, returning result 15627 1726882485.94180: _execute() done 15627 1726882485.94182: dumping result to json 15627 1726882485.94184: done dumping result, returning 15627 1726882485.94186: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-2847-7723-000000000046] 15627 1726882485.94188: sending task result for task 0e448fcc-3ce9-2847-7723-000000000046 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882485.94524: no more pending results, returning what we have 15627 1726882485.94527: results queue empty 15627 1726882485.94528: checking for any_errors_fatal 15627 1726882485.94534: done checking for any_errors_fatal 15627 1726882485.94535: checking for max_fail_percentage 15627 1726882485.94537: done checking for max_fail_percentage 15627 1726882485.94537: checking to see if all hosts have failed and the running result is not ok 15627 1726882485.94538: done checking to see if all hosts have failed 15627 
1726882485.94539: getting the remaining hosts for this loop 15627 1726882485.94541: done getting the remaining hosts for this loop 15627 1726882485.94544: getting the next task for host managed_node1 15627 1726882485.94550: done getting next task for host managed_node1 15627 1726882485.94557: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15627 1726882485.94559: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882485.94572: getting variables 15627 1726882485.94573: in VariableManager get_vars() 15627 1726882485.94607: Calling all_inventory to load vars for managed_node1 15627 1726882485.94610: Calling groups_inventory to load vars for managed_node1 15627 1726882485.94612: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882485.94618: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000046 15627 1726882485.94621: WORKER PROCESS EXITING 15627 1726882485.94638: Calling all_plugins_play to load vars for managed_node1 15627 1726882485.94643: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882485.94646: Calling groups_plugins_play to load vars for managed_node1 15627 1726882485.97972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882486.02186: done with get_vars() 15627 1726882486.02328: done getting variables 15627 1726882486.02395: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:46 -0400 (0:00:00.955) 0:00:25.777 ****** 15627 1726882486.02568: entering _queue_task() for managed_node1/service 15627 1726882486.03339: worker is 1 (out of 1 available) 15627 1726882486.03350: exiting _queue_task() for managed_node1/service 15627 1726882486.03366: done queuing things up, now waiting for results queue to drain 15627 1726882486.03368: waiting for pending results... 15627 1726882486.04138: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15627 1726882486.04242: in run() - task 0e448fcc-3ce9-2847-7723-000000000047 15627 1726882486.04432: variable 'ansible_search_path' from source: unknown 15627 1726882486.04442: variable 'ansible_search_path' from source: unknown 15627 1726882486.04494: calling self._execute() 15627 1726882486.04733: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.04813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.04836: variable 'omit' from source: magic vars 15627 1726882486.05672: variable 'ansible_distribution_major_version' from source: facts 15627 1726882486.05885: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882486.06232: variable 'network_provider' from source: set_fact 15627 1726882486.06281: Evaluated conditional (network_provider == "nm"): True 15627 1726882486.06620: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882486.06927: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882486.07367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882486.12422: Loading FilterModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882486.12510: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882486.12753: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882486.12796: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882486.12827: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882486.13051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882486.13169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882486.13201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882486.13410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882486.13439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882486.13488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882486.13634: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882486.13665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882486.13777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882486.13847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882486.13892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882486.14005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882486.14050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882486.14317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882486.14340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
15627 1726882486.14486: variable 'network_connections' from source: play vars 15627 1726882486.14502: variable 'profile' from source: play vars 15627 1726882486.14819: variable 'profile' from source: play vars 15627 1726882486.14839: variable 'interface' from source: set_fact 15627 1726882486.14927: variable 'interface' from source: set_fact 15627 1726882486.15111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882486.15480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882486.15520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882486.15577: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882486.15616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882486.15667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882486.15719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882486.15760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882486.15793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882486.15843: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882486.16156: variable 'network_connections' 
from source: play vars 15627 1726882486.16187: variable 'profile' from source: play vars 15627 1726882486.16352: variable 'profile' from source: play vars 15627 1726882486.16363: variable 'interface' from source: set_fact 15627 1726882486.16553: variable 'interface' from source: set_fact 15627 1726882486.16588: Evaluated conditional (__network_wpa_supplicant_required): False 15627 1726882486.16596: when evaluation is False, skipping this task 15627 1726882486.16602: _execute() done 15627 1726882486.16616: dumping result to json 15627 1726882486.16622: done dumping result, returning 15627 1726882486.16633: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-2847-7723-000000000047] 15627 1726882486.16644: sending task result for task 0e448fcc-3ce9-2847-7723-000000000047 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15627 1726882486.16810: no more pending results, returning what we have 15627 1726882486.16813: results queue empty 15627 1726882486.16814: checking for any_errors_fatal 15627 1726882486.16832: done checking for any_errors_fatal 15627 1726882486.16832: checking for max_fail_percentage 15627 1726882486.16836: done checking for max_fail_percentage 15627 1726882486.16837: checking to see if all hosts have failed and the running result is not ok 15627 1726882486.16838: done checking to see if all hosts have failed 15627 1726882486.16838: getting the remaining hosts for this loop 15627 1726882486.16840: done getting the remaining hosts for this loop 15627 1726882486.16844: getting the next task for host managed_node1 15627 1726882486.16852: done getting next task for host managed_node1 15627 1726882486.16858: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15627 1726882486.16860: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882486.16881: getting variables 15627 1726882486.16883: in VariableManager get_vars() 15627 1726882486.16932: Calling all_inventory to load vars for managed_node1 15627 1726882486.16934: Calling groups_inventory to load vars for managed_node1 15627 1726882486.16936: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882486.16947: Calling all_plugins_play to load vars for managed_node1 15627 1726882486.16950: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882486.16953: Calling groups_plugins_play to load vars for managed_node1 15627 1726882486.18267: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000047 15627 1726882486.18282: WORKER PROCESS EXITING 15627 1726882486.19945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882486.22174: done with get_vars() 15627 1726882486.22202: done getting variables 15627 1726882486.22267: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:46 -0400 (0:00:00.197) 0:00:25.974 ****** 15627 1726882486.22304: entering _queue_task() for managed_node1/service 15627 1726882486.22640: worker is 1 (out of 1 available) 15627 1726882486.22653: exiting _queue_task() for managed_node1/service 15627 
1726882486.22670: done queuing things up, now waiting for results queue to drain 15627 1726882486.22672: waiting for pending results... 15627 1726882486.23042: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15627 1726882486.23146: in run() - task 0e448fcc-3ce9-2847-7723-000000000048 15627 1726882486.23165: variable 'ansible_search_path' from source: unknown 15627 1726882486.23169: variable 'ansible_search_path' from source: unknown 15627 1726882486.23211: calling self._execute() 15627 1726882486.23338: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.23344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.23353: variable 'omit' from source: magic vars 15627 1726882486.23967: variable 'ansible_distribution_major_version' from source: facts 15627 1726882486.23975: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882486.24116: variable 'network_provider' from source: set_fact 15627 1726882486.24121: Evaluated conditional (network_provider == "initscripts"): False 15627 1726882486.24124: when evaluation is False, skipping this task 15627 1726882486.24127: _execute() done 15627 1726882486.24130: dumping result to json 15627 1726882486.24133: done dumping result, returning 15627 1726882486.24138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-2847-7723-000000000048] 15627 1726882486.24146: sending task result for task 0e448fcc-3ce9-2847-7723-000000000048 15627 1726882486.24401: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000048 15627 1726882486.24404: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882486.24516: no more pending results, returning what we have 15627 
1726882486.24521: results queue empty 15627 1726882486.24525: checking for any_errors_fatal 15627 1726882486.24535: done checking for any_errors_fatal 15627 1726882486.24536: checking for max_fail_percentage 15627 1726882486.24538: done checking for max_fail_percentage 15627 1726882486.24539: checking to see if all hosts have failed and the running result is not ok 15627 1726882486.24540: done checking to see if all hosts have failed 15627 1726882486.24541: getting the remaining hosts for this loop 15627 1726882486.24543: done getting the remaining hosts for this loop 15627 1726882486.24547: getting the next task for host managed_node1 15627 1726882486.24556: done getting next task for host managed_node1 15627 1726882486.24562: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15627 1726882486.24569: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882486.24589: getting variables 15627 1726882486.24592: in VariableManager get_vars() 15627 1726882486.24645: Calling all_inventory to load vars for managed_node1 15627 1726882486.24648: Calling groups_inventory to load vars for managed_node1 15627 1726882486.24651: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882486.24792: Calling all_plugins_play to load vars for managed_node1 15627 1726882486.24797: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882486.24801: Calling groups_plugins_play to load vars for managed_node1 15627 1726882486.28153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882486.30886: done with get_vars() 15627 1726882486.30918: done getting variables 15627 1726882486.31024: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:46 -0400 (0:00:00.087) 0:00:26.062 ****** 15627 1726882486.31070: entering _queue_task() for managed_node1/copy 15627 1726882486.31555: worker is 1 (out of 1 available) 15627 1726882486.31570: exiting _queue_task() for managed_node1/copy 15627 1726882486.31584: done queuing things up, now waiting for results queue to drain 15627 1726882486.31586: waiting for pending results... 
15627 1726882486.31937: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15627 1726882486.32222: in run() - task 0e448fcc-3ce9-2847-7723-000000000049 15627 1726882486.32226: variable 'ansible_search_path' from source: unknown 15627 1726882486.32230: variable 'ansible_search_path' from source: unknown 15627 1726882486.32233: calling self._execute() 15627 1726882486.32236: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.32239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.32242: variable 'omit' from source: magic vars 15627 1726882486.33956: variable 'ansible_distribution_major_version' from source: facts 15627 1726882486.33960: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882486.33964: variable 'network_provider' from source: set_fact 15627 1726882486.33967: Evaluated conditional (network_provider == "initscripts"): False 15627 1726882486.33970: when evaluation is False, skipping this task 15627 1726882486.33972: _execute() done 15627 1726882486.33974: dumping result to json 15627 1726882486.33976: done dumping result, returning 15627 1726882486.33979: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-2847-7723-000000000049] 15627 1726882486.33981: sending task result for task 0e448fcc-3ce9-2847-7723-000000000049 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15627 1726882486.34118: no more pending results, returning what we have 15627 1726882486.34121: results queue empty 15627 1726882486.34121: checking for any_errors_fatal 15627 1726882486.34125: done checking for any_errors_fatal 15627 1726882486.34126: checking for max_fail_percentage 15627 
1726882486.34128: done checking for max_fail_percentage 15627 1726882486.34129: checking to see if all hosts have failed and the running result is not ok 15627 1726882486.34130: done checking to see if all hosts have failed 15627 1726882486.34130: getting the remaining hosts for this loop 15627 1726882486.34132: done getting the remaining hosts for this loop 15627 1726882486.34134: getting the next task for host managed_node1 15627 1726882486.34140: done getting next task for host managed_node1 15627 1726882486.34144: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15627 1726882486.34146: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882486.34161: getting variables 15627 1726882486.34163: in VariableManager get_vars() 15627 1726882486.34202: Calling all_inventory to load vars for managed_node1 15627 1726882486.34205: Calling groups_inventory to load vars for managed_node1 15627 1726882486.34208: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882486.34216: Calling all_plugins_play to load vars for managed_node1 15627 1726882486.34219: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882486.34222: Calling groups_plugins_play to load vars for managed_node1 15627 1726882486.35362: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000049 15627 1726882486.35370: WORKER PROCESS EXITING 15627 1726882486.36949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882486.39636: done with get_vars() 15627 1726882486.39760: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:46 -0400 (0:00:00.089) 0:00:26.152 ****** 15627 1726882486.40067: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15627 1726882486.40359: worker is 1 (out of 1 available) 15627 1726882486.40372: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15627 1726882486.41076: done queuing things up, now waiting for results queue to drain 15627 1726882486.41078: waiting for pending results... 15627 1726882486.41321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15627 1726882486.41542: in run() - task 0e448fcc-3ce9-2847-7723-00000000004a 15627 1726882486.41555: variable 'ansible_search_path' from source: unknown 15627 1726882486.41559: variable 'ansible_search_path' from source: unknown 15627 1726882486.41718: calling self._execute() 15627 1726882486.41941: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.41948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.41959: variable 'omit' from source: magic vars 15627 1726882486.43374: variable 'ansible_distribution_major_version' from source: facts 15627 1726882486.43482: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882486.43631: variable 'omit' from source: magic vars 15627 1726882486.43748: variable 'omit' from source: magic vars 15627 1726882486.44509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882486.47971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882486.48034: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 
1726882486.48077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882486.48131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882486.48165: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882486.48247: variable 'network_provider' from source: set_fact 15627 1726882486.49136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882486.49299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882486.49389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882486.49615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882486.49733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882486.49962: variable 'omit' from source: magic vars 15627 1726882486.50492: variable 'omit' from source: magic vars 15627 1726882486.50786: variable 'network_connections' from source: play vars 15627 1726882486.50797: variable 'profile' from source: play vars 15627 1726882486.51045: variable 'profile' from source: play vars 15627 1726882486.51133: variable 'interface' from source: set_fact 
15627 1726882486.51191: variable 'interface' from source: set_fact 15627 1726882486.51621: variable 'omit' from source: magic vars 15627 1726882486.51631: variable '__lsr_ansible_managed' from source: task vars 15627 1726882486.51898: variable '__lsr_ansible_managed' from source: task vars 15627 1726882486.52289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15627 1726882486.52676: Loaded config def from plugin (lookup/template) 15627 1726882486.52786: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15627 1726882486.52817: File lookup term: get_ansible_managed.j2 15627 1726882486.52867: variable 'ansible_search_path' from source: unknown 15627 1726882486.52956: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15627 1726882486.53105: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
15627 1726882486.53113: variable 'ansible_search_path' from source: unknown 15627 1726882486.64005: variable 'ansible_managed' from source: unknown 15627 1726882486.64312: variable 'omit' from source: magic vars 15627 1726882486.64350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882486.64532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882486.64535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882486.64566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882486.64570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882486.64622: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882486.64673: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.64677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.64897: Set connection var ansible_timeout to 10 15627 1726882486.64905: Set connection var ansible_shell_executable to /bin/sh 15627 1726882486.64910: Set connection var ansible_connection to ssh 15627 1726882486.64916: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882486.64921: Set connection var ansible_pipelining to False 15627 1726882486.64923: Set connection var ansible_shell_type to sh 15627 1726882486.65077: variable 'ansible_shell_executable' from source: unknown 15627 1726882486.65081: variable 'ansible_connection' from source: unknown 15627 1726882486.65085: variable 'ansible_module_compression' from source: unknown 15627 1726882486.65088: variable 'ansible_shell_type' from source: unknown 15627 1726882486.65090: variable 'ansible_shell_executable' 
from source: unknown 15627 1726882486.65092: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882486.65094: variable 'ansible_pipelining' from source: unknown 15627 1726882486.65096: variable 'ansible_timeout' from source: unknown 15627 1726882486.65098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882486.65341: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882486.65352: variable 'omit' from source: magic vars 15627 1726882486.65354: starting attempt loop 15627 1726882486.65361: running the handler 15627 1726882486.65392: _low_level_execute_command(): starting 15627 1726882486.65399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882486.66910: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882486.66932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.66989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882486.66992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882486.67021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 15627 1726882486.67029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.67034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882486.67051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882486.67130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882486.67173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882486.67177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882486.67324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882486.69012: stdout chunk (state=3): >>>/root <<< 15627 1726882486.69096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882486.69260: stderr chunk (state=3): >>><<< 15627 1726882486.69265: stdout chunk (state=3): >>><<< 15627 1726882486.69371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882486.69374: _low_level_execute_command(): starting 15627 1726882486.69378: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028 `" && echo ansible-tmp-1726882486.6928527-16727-182561179751028="` echo /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028 `" ) && sleep 0' 15627 1726882486.71119: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882486.71122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.71163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882486.71167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882486.71169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882486.71171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882486.71346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882486.71420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882486.71633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882486.73442: stdout chunk (state=3): >>>ansible-tmp-1726882486.6928527-16727-182561179751028=/root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028 <<< 15627 1726882486.73558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882486.73629: stderr chunk (state=3): >>><<< 15627 1726882486.73632: stdout chunk (state=3): >>><<< 15627 1726882486.73773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882486.6928527-16727-182561179751028=/root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 
1726882486.73776: variable 'ansible_module_compression' from source: unknown 15627 1726882486.73778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15627 1726882486.73780: variable 'ansible_facts' from source: unknown 15627 1726882486.73859: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/AnsiballZ_network_connections.py 15627 1726882486.74488: Sending initial data 15627 1726882486.74492: Sent initial data (168 bytes) 15627 1726882486.76934: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882486.76942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.77084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882486.77087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882486.77089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882486.77158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882486.77281: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15627 1726882486.77495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882486.79229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882486.79233: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882486.79312: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882486.79411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpt_m1bpn3 /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/AnsiballZ_network_connections.py <<< 15627 1726882486.79503: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882486.81577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882486.81671: stderr chunk (state=3): >>><<< 15627 1726882486.81674: stdout chunk (state=3): >>><<< 15627 1726882486.81696: done transferring module to remote 15627 1726882486.81707: _low_level_execute_command(): starting 15627 1726882486.81712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/ /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/AnsiballZ_network_connections.py && sleep 0' 15627 1726882486.82639: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882486.82646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.82692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882486.82698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 15627 1726882486.82711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882486.82717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882486.82737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882486.82827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882486.82857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882486.82861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882486.82979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882486.84691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882486.84737: stderr chunk (state=3): >>><<< 15627 1726882486.84740: stdout chunk (state=3): >>><<< 15627 1726882486.84753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882486.84758: _low_level_execute_command(): starting 15627 1726882486.84761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/AnsiballZ_network_connections.py && sleep 0' 15627 1726882486.85215: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882486.85368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882486.85372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882486.85447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.12999: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15627 1726882487.14612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882487.14616: stdout chunk (state=3): >>><<< 15627 1726882487.14619: stderr chunk (state=3): >>><<< 15627 1726882487.14775: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882487.14786: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882487.14789: _low_level_execute_command(): starting 15627 1726882487.14792: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882486.6928527-16727-182561179751028/ > /dev/null 2>&1 && sleep 0' 15627 1726882487.16372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882487.16481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.16497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.16521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.16567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.16625: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882487.16644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 
1726882487.16668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882487.16682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882487.16693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882487.16706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.16725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.16742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.16757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.16775: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882487.16790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.16908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882487.17069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.17086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.17297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.19262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882487.19268: stdout chunk (state=3): >>><<< 15627 1726882487.19270: stderr chunk (state=3): >>><<< 15627 1726882487.19469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882487.19472: handler run complete 15627 1726882487.19474: attempt loop complete, returning result 15627 1726882487.19476: _execute() done 15627 1726882487.19478: dumping result to json 15627 1726882487.19480: done dumping result, returning 15627 1726882487.19482: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-2847-7723-00000000004a] 15627 1726882487.19484: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004a 15627 1726882487.19552: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004a 15627 1726882487.19557: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15627 1726882487.19853: no more pending results, returning what we have 15627 1726882487.20168: results queue empty 15627 1726882487.20170: checking for any_errors_fatal 15627 1726882487.20177: 
done checking for any_errors_fatal 15627 1726882487.20178: checking for max_fail_percentage 15627 1726882487.20180: done checking for max_fail_percentage 15627 1726882487.20180: checking to see if all hosts have failed and the running result is not ok 15627 1726882487.20181: done checking to see if all hosts have failed 15627 1726882487.20182: getting the remaining hosts for this loop 15627 1726882487.20184: done getting the remaining hosts for this loop 15627 1726882487.20188: getting the next task for host managed_node1 15627 1726882487.20194: done getting next task for host managed_node1 15627 1726882487.20198: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882487.20200: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882487.20209: getting variables 15627 1726882487.20210: in VariableManager get_vars() 15627 1726882487.20246: Calling all_inventory to load vars for managed_node1 15627 1726882487.20248: Calling groups_inventory to load vars for managed_node1 15627 1726882487.20250: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882487.20262: Calling all_plugins_play to load vars for managed_node1 15627 1726882487.20267: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882487.20270: Calling groups_plugins_play to load vars for managed_node1 15627 1726882487.23588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882487.25642: done with get_vars() 15627 1726882487.25696: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:47 -0400 (0:00:00.857) 0:00:27.009 ****** 15627 1726882487.25831: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882487.26447: worker is 1 (out of 1 available) 15627 1726882487.26458: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882487.26475: done queuing things up, now waiting for results queue to drain 15627 1726882487.26477: waiting for pending results... 
15627 1726882487.26759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882487.26853: in run() - task 0e448fcc-3ce9-2847-7723-00000000004b 15627 1726882487.26870: variable 'ansible_search_path' from source: unknown 15627 1726882487.26874: variable 'ansible_search_path' from source: unknown 15627 1726882487.26925: calling self._execute() 15627 1726882487.27017: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.27021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.27035: variable 'omit' from source: magic vars 15627 1726882487.27650: variable 'ansible_distribution_major_version' from source: facts 15627 1726882487.27663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882487.27788: variable 'network_state' from source: role '' defaults 15627 1726882487.27800: Evaluated conditional (network_state != {}): False 15627 1726882487.27804: when evaluation is False, skipping this task 15627 1726882487.27807: _execute() done 15627 1726882487.27811: dumping result to json 15627 1726882487.27814: done dumping result, returning 15627 1726882487.27817: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-2847-7723-00000000004b] 15627 1726882487.27824: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004b 15627 1726882487.27916: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004b 15627 1726882487.27920: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882487.27974: no more pending results, returning what we have 15627 1726882487.27978: results queue empty 15627 1726882487.27979: checking for any_errors_fatal 15627 1726882487.27988: done checking for any_errors_fatal 
15627 1726882487.27989: checking for max_fail_percentage 15627 1726882487.27991: done checking for max_fail_percentage 15627 1726882487.27992: checking to see if all hosts have failed and the running result is not ok 15627 1726882487.27993: done checking to see if all hosts have failed 15627 1726882487.27994: getting the remaining hosts for this loop 15627 1726882487.27996: done getting the remaining hosts for this loop 15627 1726882487.27999: getting the next task for host managed_node1 15627 1726882487.28007: done getting next task for host managed_node1 15627 1726882487.28011: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882487.28014: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882487.28028: getting variables 15627 1726882487.28030: in VariableManager get_vars() 15627 1726882487.28070: Calling all_inventory to load vars for managed_node1 15627 1726882487.28073: Calling groups_inventory to load vars for managed_node1 15627 1726882487.28075: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882487.28087: Calling all_plugins_play to load vars for managed_node1 15627 1726882487.28091: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882487.28093: Calling groups_plugins_play to load vars for managed_node1 15627 1726882487.30635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882487.33418: done with get_vars() 15627 1726882487.33447: done getting variables 15627 1726882487.33510: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:47 -0400 (0:00:00.077) 0:00:27.087 ****** 15627 1726882487.33543: entering _queue_task() for managed_node1/debug 15627 1726882487.33860: worker is 1 (out of 1 available) 15627 1726882487.34034: exiting _queue_task() for managed_node1/debug 15627 1726882487.34047: done queuing things up, now waiting for results queue to drain 15627 1726882487.34048: waiting for pending results... 
15627 1726882487.34404: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882487.34491: in run() - task 0e448fcc-3ce9-2847-7723-00000000004c 15627 1726882487.34505: variable 'ansible_search_path' from source: unknown 15627 1726882487.34510: variable 'ansible_search_path' from source: unknown 15627 1726882487.34544: calling self._execute() 15627 1726882487.34753: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.34760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.34829: variable 'omit' from source: magic vars 15627 1726882487.35448: variable 'ansible_distribution_major_version' from source: facts 15627 1726882487.35460: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882487.35473: variable 'omit' from source: magic vars 15627 1726882487.35515: variable 'omit' from source: magic vars 15627 1726882487.35595: variable 'omit' from source: magic vars 15627 1726882487.35636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882487.35793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882487.35815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882487.35836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.35846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.35991: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882487.35995: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.35998: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15627 1726882487.36222: Set connection var ansible_timeout to 10 15627 1726882487.36230: Set connection var ansible_shell_executable to /bin/sh 15627 1726882487.36235: Set connection var ansible_connection to ssh 15627 1726882487.36240: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882487.36246: Set connection var ansible_pipelining to False 15627 1726882487.36249: Set connection var ansible_shell_type to sh 15627 1726882487.36308: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.36311: variable 'ansible_connection' from source: unknown 15627 1726882487.36314: variable 'ansible_module_compression' from source: unknown 15627 1726882487.36317: variable 'ansible_shell_type' from source: unknown 15627 1726882487.36319: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.36321: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.36325: variable 'ansible_pipelining' from source: unknown 15627 1726882487.36328: variable 'ansible_timeout' from source: unknown 15627 1726882487.36333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.36600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882487.36608: variable 'omit' from source: magic vars 15627 1726882487.36612: starting attempt loop 15627 1726882487.36615: running the handler 15627 1726882487.36748: variable '__network_connections_result' from source: set_fact 15627 1726882487.36821: handler run complete 15627 1726882487.36873: attempt loop complete, returning result 15627 1726882487.36876: _execute() done 15627 1726882487.36879: dumping result to json 15627 1726882487.36881: 
done dumping result, returning 15627 1726882487.36892: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-2847-7723-00000000004c] 15627 1726882487.36895: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004c 15627 1726882487.36993: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004c 15627 1726882487.36998: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15627 1726882487.37066: no more pending results, returning what we have 15627 1726882487.37070: results queue empty 15627 1726882487.37071: checking for any_errors_fatal 15627 1726882487.37078: done checking for any_errors_fatal 15627 1726882487.37079: checking for max_fail_percentage 15627 1726882487.37081: done checking for max_fail_percentage 15627 1726882487.37082: checking to see if all hosts have failed and the running result is not ok 15627 1726882487.37083: done checking to see if all hosts have failed 15627 1726882487.37084: getting the remaining hosts for this loop 15627 1726882487.37086: done getting the remaining hosts for this loop 15627 1726882487.37091: getting the next task for host managed_node1 15627 1726882487.37098: done getting next task for host managed_node1 15627 1726882487.37102: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882487.37105: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882487.37114: getting variables 15627 1726882487.37116: in VariableManager get_vars() 15627 1726882487.37157: Calling all_inventory to load vars for managed_node1 15627 1726882487.37160: Calling groups_inventory to load vars for managed_node1 15627 1726882487.37162: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882487.37176: Calling all_plugins_play to load vars for managed_node1 15627 1726882487.37179: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882487.37182: Calling groups_plugins_play to load vars for managed_node1 15627 1726882487.39115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882487.44946: done with get_vars() 15627 1726882487.44985: done getting variables 15627 1726882487.45120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:47 -0400 (0:00:00.117) 0:00:27.204 ****** 15627 1726882487.45276: entering _queue_task() for managed_node1/debug 15627 1726882487.46039: worker is 1 (out of 1 available) 15627 1726882487.46051: exiting _queue_task() for managed_node1/debug 15627 1726882487.46068: done queuing things up, now waiting for results queue to drain 15627 1726882487.46070: waiting for pending results... 
15627 1726882487.47156: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882487.47277: in run() - task 0e448fcc-3ce9-2847-7723-00000000004d 15627 1726882487.47446: variable 'ansible_search_path' from source: unknown 15627 1726882487.47456: variable 'ansible_search_path' from source: unknown 15627 1726882487.47500: calling self._execute() 15627 1726882487.47719: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.47732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.47757: variable 'omit' from source: magic vars 15627 1726882487.48568: variable 'ansible_distribution_major_version' from source: facts 15627 1726882487.48644: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882487.48659: variable 'omit' from source: magic vars 15627 1726882487.48703: variable 'omit' from source: magic vars 15627 1726882487.48808: variable 'omit' from source: magic vars 15627 1726882487.48892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882487.49095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882487.49121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882487.49143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.49170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.49317: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882487.49358: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.49371: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15627 1726882487.49488: Set connection var ansible_timeout to 10 15627 1726882487.49583: Set connection var ansible_shell_executable to /bin/sh 15627 1726882487.49620: Set connection var ansible_connection to ssh 15627 1726882487.49631: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882487.49729: Set connection var ansible_pipelining to False 15627 1726882487.49738: Set connection var ansible_shell_type to sh 15627 1726882487.49773: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.49782: variable 'ansible_connection' from source: unknown 15627 1726882487.49790: variable 'ansible_module_compression' from source: unknown 15627 1726882487.49797: variable 'ansible_shell_type' from source: unknown 15627 1726882487.49804: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.49811: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.49823: variable 'ansible_pipelining' from source: unknown 15627 1726882487.49837: variable 'ansible_timeout' from source: unknown 15627 1726882487.49878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.50185: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882487.50203: variable 'omit' from source: magic vars 15627 1726882487.50276: starting attempt loop 15627 1726882487.50283: running the handler 15627 1726882487.50335: variable '__network_connections_result' from source: set_fact 15627 1726882487.50550: variable '__network_connections_result' from source: set_fact 15627 1726882487.50794: handler run complete 15627 1726882487.50845: attempt loop complete, returning result 15627 1726882487.50921: 
_execute() done 15627 1726882487.50931: dumping result to json 15627 1726882487.50938: done dumping result, returning 15627 1726882487.50948: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-2847-7723-00000000004d] 15627 1726882487.50959: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004d
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "LSR-TST-br31",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15627 1726882487.51136: no more pending results, returning what we have 15627 1726882487.51140: results queue empty 15627 1726882487.51141: checking for any_errors_fatal 15627 1726882487.51148: done checking for any_errors_fatal 15627 1726882487.51149: checking for max_fail_percentage 15627 1726882487.51151: done checking for max_fail_percentage 15627 1726882487.51151: checking to see if all hosts have failed and the running result is not ok 15627 1726882487.51153: done checking to see if all hosts have failed 15627 1726882487.51156: getting the remaining hosts for this loop 15627 1726882487.51158: done getting the remaining hosts for this loop 15627 1726882487.51162: getting the next task for host managed_node1 15627 1726882487.51178: done getting next task for host managed_node1 15627 1726882487.51183: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882487.51185: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False 15627 1726882487.51195: getting variables 15627 1726882487.51197: in VariableManager get_vars() 15627 1726882487.51235: Calling all_inventory to load vars for managed_node1 15627 1726882487.51238: Calling groups_inventory to load vars for managed_node1 15627 1726882487.51241: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882487.51251: Calling all_plugins_play to load vars for managed_node1 15627 1726882487.51257: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882487.51260: Calling groups_plugins_play to load vars for managed_node1 15627 1726882487.52375: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004d 15627 1726882487.52379: WORKER PROCESS EXITING 15627 1726882487.53292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882487.55923: done with get_vars() 15627 1726882487.55948: done getting variables 15627 1726882487.56357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:47 -0400 (0:00:00.113) 0:00:27.318 ****** 15627 1726882487.56713: entering _queue_task() for managed_node1/debug 15627 1726882487.58363: worker is 1 (out of 1 available) 15627 1726882487.58378: exiting _queue_task() for managed_node1/debug 15627 1726882487.58390: done queuing things up, now waiting for results queue to drain 15627 1726882487.58391: waiting for pending results... 
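The "Show debug messages for the network_state" task that runs next is skipped; the log reports its `false_condition` verbatim as `network_state != {}` (task path `roles/network/tasks/main.yml:186`), meaning the role's `network_state` default of `{}` was never overridden in this play. A hypothetical sketch of such a task, inferred from the trace rather than taken from the role source:

```yaml
# Hypothetical sketch inferred from the trace; the task only fires
# when the caller actually supplies a network_state dictionary.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
```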
15627 1726882487.58837: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882487.58974: in run() - task 0e448fcc-3ce9-2847-7723-00000000004e 15627 1726882487.58988: variable 'ansible_search_path' from source: unknown 15627 1726882487.58992: variable 'ansible_search_path' from source: unknown 15627 1726882487.59025: calling self._execute() 15627 1726882487.59152: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.59158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.59171: variable 'omit' from source: magic vars 15627 1726882487.59640: variable 'ansible_distribution_major_version' from source: facts 15627 1726882487.59653: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882487.59780: variable 'network_state' from source: role '' defaults 15627 1726882487.59794: Evaluated conditional (network_state != {}): False 15627 1726882487.59797: when evaluation is False, skipping this task 15627 1726882487.59799: _execute() done 15627 1726882487.59802: dumping result to json 15627 1726882487.59804: done dumping result, returning 15627 1726882487.59810: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-2847-7723-00000000004e] 15627 1726882487.59818: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004e 15627 1726882487.59919: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004e 15627 1726882487.59922: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15627 1726882487.59979: no more pending results, returning what we have 15627 1726882487.59984: results queue empty 15627 1726882487.59984: checking for any_errors_fatal 15627 1726882487.59995: done checking for any_errors_fatal 15627 1726882487.59995: checking for 
max_fail_percentage 15627 1726882487.59997: done checking for max_fail_percentage 15627 1726882487.59998: checking to see if all hosts have failed and the running result is not ok 15627 1726882487.59999: done checking to see if all hosts have failed 15627 1726882487.60000: getting the remaining hosts for this loop 15627 1726882487.60002: done getting the remaining hosts for this loop 15627 1726882487.60006: getting the next task for host managed_node1 15627 1726882487.60014: done getting next task for host managed_node1 15627 1726882487.60018: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882487.60021: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882487.60034: getting variables 15627 1726882487.60036: in VariableManager get_vars() 15627 1726882487.60076: Calling all_inventory to load vars for managed_node1 15627 1726882487.60079: Calling groups_inventory to load vars for managed_node1 15627 1726882487.60082: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882487.60095: Calling all_plugins_play to load vars for managed_node1 15627 1726882487.60098: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882487.60102: Calling groups_plugins_play to load vars for managed_node1 15627 1726882487.68900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882487.73469: done with get_vars() 15627 1726882487.73525: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:47 -0400 
(0:00:00.169) 0:00:27.487 ****** 15627 1726882487.73628: entering _queue_task() for managed_node1/ping 15627 1726882487.74120: worker is 1 (out of 1 available) 15627 1726882487.74131: exiting _queue_task() for managed_node1/ping 15627 1726882487.74142: done queuing things up, now waiting for results queue to drain 15627 1726882487.74143: waiting for pending results... 15627 1726882487.74575: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882487.74704: in run() - task 0e448fcc-3ce9-2847-7723-00000000004f 15627 1726882487.74718: variable 'ansible_search_path' from source: unknown 15627 1726882487.74722: variable 'ansible_search_path' from source: unknown 15627 1726882487.74851: calling self._execute() 15627 1726882487.75168: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.75173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.75183: variable 'omit' from source: magic vars 15627 1726882487.75551: variable 'ansible_distribution_major_version' from source: facts 15627 1726882487.75562: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882487.75571: variable 'omit' from source: magic vars 15627 1726882487.75644: variable 'omit' from source: magic vars 15627 1726882487.75679: variable 'omit' from source: magic vars 15627 1726882487.75718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882487.75752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882487.75799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882487.75817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.75829: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882487.75858: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882487.75863: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.75868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.75994: Set connection var ansible_timeout to 10 15627 1726882487.76002: Set connection var ansible_shell_executable to /bin/sh 15627 1726882487.76008: Set connection var ansible_connection to ssh 15627 1726882487.76019: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882487.76024: Set connection var ansible_pipelining to False 15627 1726882487.76027: Set connection var ansible_shell_type to sh 15627 1726882487.76051: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.76059: variable 'ansible_connection' from source: unknown 15627 1726882487.76062: variable 'ansible_module_compression' from source: unknown 15627 1726882487.76065: variable 'ansible_shell_type' from source: unknown 15627 1726882487.76084: variable 'ansible_shell_executable' from source: unknown 15627 1726882487.76087: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882487.76092: variable 'ansible_pipelining' from source: unknown 15627 1726882487.76095: variable 'ansible_timeout' from source: unknown 15627 1726882487.76098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882487.76305: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882487.76315: variable 'omit' from source: magic vars 15627 1726882487.76319: starting attempt loop 15627 1726882487.76323: running 
the handler 15627 1726882487.76335: _low_level_execute_command(): starting 15627 1726882487.76349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882487.77219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882487.77233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.77248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.77266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.77305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.77313: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882487.77323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.77336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882487.77344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882487.77357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882487.77365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.77380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.77391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.77399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.77406: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882487.77416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.77531: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882487.77551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.77565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.77691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.79374: stdout chunk (state=3): >>>/root <<< 15627 1726882487.79516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882487.79519: stdout chunk (state=3): >>><<< 15627 1726882487.79529: stderr chunk (state=3): >>><<< 15627 1726882487.79558: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882487.79569: _low_level_execute_command(): starting 15627 1726882487.79575: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673 `" && echo ansible-tmp-1726882487.7955148-16784-183351823809673="` echo /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673 `" ) && sleep 0' 15627 1726882487.80238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882487.80246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.80260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.80273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.80310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.80316: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882487.80327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.80337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882487.80345: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882487.80352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882487.80358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.80372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.80384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.80391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.80397: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882487.80406: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.80479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882487.80492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.80502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.80624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.82483: stdout chunk (state=3): >>>ansible-tmp-1726882487.7955148-16784-183351823809673=/root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673 <<< 15627 1726882487.82681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882487.82685: stdout chunk (state=3): >>><<< 15627 1726882487.82691: stderr chunk (state=3): >>><<< 15627 1726882487.83018: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882487.7955148-16784-183351823809673=/root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882487.83022: variable 'ansible_module_compression' from source: unknown 15627 1726882487.83026: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15627 1726882487.83029: variable 'ansible_facts' from source: unknown 15627 1726882487.83032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/AnsiballZ_ping.py 15627 1726882487.83097: Sending initial data 15627 1726882487.83107: Sent initial data (153 bytes) 15627 1726882487.84474: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.84478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.84510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882487.84514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.84517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.84597: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 15627 1726882487.84604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.84607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.84703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.86494: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882487.86677: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882487.86775: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp1rzz45k5 /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/AnsiballZ_ping.py <<< 15627 1726882487.86949: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882487.89329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882487.89411: stderr chunk (state=3): >>><<< 15627 1726882487.89415: stdout chunk (state=3): >>><<< 15627 1726882487.89436: done transferring module to remote 15627 1726882487.89446: _low_level_execute_command(): starting 15627 1726882487.89452: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/ 
/root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/AnsiballZ_ping.py && sleep 0' 15627 1726882487.93197: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882487.93212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.93224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.93238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.93288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.93296: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882487.93304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.93319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882487.93326: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882487.93332: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882487.93343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.93355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.93370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.93380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882487.93384: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882487.93395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.93481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 
1726882487.93498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.93505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.93631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882487.95495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882487.95499: stdout chunk (state=3): >>><<< 15627 1726882487.95501: stderr chunk (state=3): >>><<< 15627 1726882487.95586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882487.95589: _low_level_execute_command(): starting 15627 1726882487.95591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/AnsiballZ_ping.py && sleep 0' 15627 
1726882487.96717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882487.96721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882487.96751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882487.96755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882487.96757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882487.96834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882487.96847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882487.96978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.09923: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15627 1726882488.11051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882488.11055: stdout chunk (state=3): >>><<< 15627 1726882488.11057: stderr chunk (state=3): >>><<< 15627 1726882488.11183: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882488.11188: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882488.11190: _low_level_execute_command(): starting 15627 1726882488.11192: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882487.7955148-16784-183351823809673/ > /dev/null 2>&1 && sleep 0' 15627 1726882488.11732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882488.11746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.11762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.11792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.11832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882488.11849: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882488.11866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.11886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882488.11898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 
1726882488.11910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882488.11922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.11938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.11960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.11976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882488.11988: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882488.12006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.12117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882488.12328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882488.12347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.12479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.14357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882488.14361: stdout chunk (state=3): >>><<< 15627 1726882488.14365: stderr chunk (state=3): >>><<< 15627 1726882488.14572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882488.14575: handler run complete 15627 1726882488.14578: attempt loop complete, returning result 15627 1726882488.14580: _execute() done 15627 1726882488.14582: dumping result to json 15627 1726882488.14584: done dumping result, returning 15627 1726882488.14587: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-2847-7723-00000000004f] 15627 1726882488.14589: sending task result for task 0e448fcc-3ce9-2847-7723-00000000004f 15627 1726882488.14669: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000004f 15627 1726882488.14672: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15627 1726882488.14728: no more pending results, returning what we have 15627 1726882488.14731: results queue empty 15627 1726882488.14732: checking for any_errors_fatal 15627 1726882488.14739: done checking for any_errors_fatal 15627 1726882488.14740: checking for max_fail_percentage 15627 1726882488.14741: done checking for max_fail_percentage 15627 1726882488.14742: checking to see if all hosts have failed and the running result is not ok 15627 1726882488.14743: done checking to see if all hosts have failed 15627 1726882488.14744: getting the remaining hosts for this loop 15627 1726882488.14745: done getting the remaining hosts for this loop 15627 1726882488.14749: getting 
the next task for host managed_node1 15627 1726882488.14757: done getting next task for host managed_node1 15627 1726882488.14759: ^ task is: TASK: meta (role_complete) 15627 1726882488.14760: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882488.14772: getting variables 15627 1726882488.14774: in VariableManager get_vars() 15627 1726882488.14809: Calling all_inventory to load vars for managed_node1 15627 1726882488.14811: Calling groups_inventory to load vars for managed_node1 15627 1726882488.14814: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.14823: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.14825: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.14827: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.18572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882488.23478: done with get_vars() 15627 1726882488.23616: done getting variables 15627 1726882488.23702: done queuing things up, now waiting for results queue to drain 15627 1726882488.23704: results queue empty 15627 1726882488.23705: checking for any_errors_fatal 15627 1726882488.23708: done checking for any_errors_fatal 15627 1726882488.23709: checking for max_fail_percentage 15627 1726882488.23710: done checking for max_fail_percentage 15627 1726882488.23710: checking to see if all hosts have failed and the running result is not ok 15627 1726882488.23711: done checking to see if all hosts have failed 15627 1726882488.23712: getting the remaining hosts for this loop 15627 1726882488.23713: done getting the remaining hosts for this loop 15627 1726882488.23715: 
getting the next task for host managed_node1 15627 1726882488.23835: done getting next task for host managed_node1 15627 1726882488.23837: ^ task is: TASK: meta (flush_handlers) 15627 1726882488.23839: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882488.23843: getting variables 15627 1726882488.23844: in VariableManager get_vars() 15627 1726882488.23857: Calling all_inventory to load vars for managed_node1 15627 1726882488.23859: Calling groups_inventory to load vars for managed_node1 15627 1726882488.23862: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.23869: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.23872: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.23875: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.26409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882488.30227: done with get_vars() 15627 1726882488.30381: done getting variables 15627 1726882488.30435: in VariableManager get_vars() 15627 1726882488.30449: Calling all_inventory to load vars for managed_node1 15627 1726882488.30570: Calling groups_inventory to load vars for managed_node1 15627 1726882488.30578: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.30584: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.30587: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.30589: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.35432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 
1726882488.37607: done with get_vars() 15627 1726882488.37636: done queuing things up, now waiting for results queue to drain 15627 1726882488.37638: results queue empty 15627 1726882488.37639: checking for any_errors_fatal 15627 1726882488.37641: done checking for any_errors_fatal 15627 1726882488.37641: checking for max_fail_percentage 15627 1726882488.37643: done checking for max_fail_percentage 15627 1726882488.37643: checking to see if all hosts have failed and the running result is not ok 15627 1726882488.37644: done checking to see if all hosts have failed 15627 1726882488.37645: getting the remaining hosts for this loop 15627 1726882488.37646: done getting the remaining hosts for this loop 15627 1726882488.37649: getting the next task for host managed_node1 15627 1726882488.37653: done getting next task for host managed_node1 15627 1726882488.37658: ^ task is: TASK: meta (flush_handlers) 15627 1726882488.37659: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882488.37668: getting variables 15627 1726882488.37669: in VariableManager get_vars() 15627 1726882488.37682: Calling all_inventory to load vars for managed_node1 15627 1726882488.37685: Calling groups_inventory to load vars for managed_node1 15627 1726882488.37687: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.37692: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.37695: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.37698: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.42066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882488.46887: done with get_vars() 15627 1726882488.46923: done getting variables 15627 1726882488.46999: in VariableManager get_vars() 15627 1726882488.47018: Calling all_inventory to load vars for managed_node1 15627 1726882488.47020: Calling groups_inventory to load vars for managed_node1 15627 1726882488.47022: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.47028: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.47030: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.47037: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.49340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882488.50431: done with get_vars() 15627 1726882488.50449: done queuing things up, now waiting for results queue to drain 15627 1726882488.50451: results queue empty 15627 1726882488.50452: checking for any_errors_fatal 15627 1726882488.50453: done checking for any_errors_fatal 15627 1726882488.50453: checking for max_fail_percentage 15627 1726882488.50456: done checking for max_fail_percentage 15627 1726882488.50456: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882488.50457: done checking to see if all hosts have failed 15627 1726882488.50457: getting the remaining hosts for this loop 15627 1726882488.50458: done getting the remaining hosts for this loop 15627 1726882488.50460: getting the next task for host managed_node1 15627 1726882488.50462: done getting next task for host managed_node1 15627 1726882488.50464: ^ task is: None 15627 1726882488.50466: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882488.50467: done queuing things up, now waiting for results queue to drain 15627 1726882488.50467: results queue empty 15627 1726882488.50468: checking for any_errors_fatal 15627 1726882488.50468: done checking for any_errors_fatal 15627 1726882488.50469: checking for max_fail_percentage 15627 1726882488.50469: done checking for max_fail_percentage 15627 1726882488.50470: checking to see if all hosts have failed and the running result is not ok 15627 1726882488.50470: done checking to see if all hosts have failed 15627 1726882488.50471: getting the next task for host managed_node1 15627 1726882488.50472: done getting next task for host managed_node1 15627 1726882488.50473: ^ task is: None 15627 1726882488.50473: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882488.50508: in VariableManager get_vars() 15627 1726882488.50520: done with get_vars() 15627 1726882488.50527: in VariableManager get_vars() 15627 1726882488.50534: done with get_vars() 15627 1726882488.50537: variable 'omit' from source: magic vars 15627 1726882488.50559: in VariableManager get_vars() 15627 1726882488.50569: done with get_vars() 15627 1726882488.50585: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15627 1726882488.50707: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882488.50726: getting the remaining hosts for this loop 15627 1726882488.50727: done getting the remaining hosts for this loop 15627 1726882488.50728: getting the next task for host managed_node1 15627 1726882488.50730: done getting next task for host managed_node1 15627 1726882488.50731: ^ task is: TASK: Gathering Facts 15627 1726882488.50732: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882488.50733: getting variables 15627 1726882488.50734: in VariableManager get_vars() 15627 1726882488.50743: Calling all_inventory to load vars for managed_node1 15627 1726882488.50745: Calling groups_inventory to load vars for managed_node1 15627 1726882488.50746: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882488.50755: Calling all_plugins_play to load vars for managed_node1 15627 1726882488.50758: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882488.50761: Calling groups_plugins_play to load vars for managed_node1 15627 1726882488.51536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882488.53106: done with get_vars() 15627 1726882488.53134: done getting variables 15627 1726882488.53249: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:34:48 -0400 (0:00:00.796) 0:00:28.284 ****** 15627 1726882488.53316: entering _queue_task() for managed_node1/gather_facts 15627 1726882488.53947: worker is 1 (out of 1 available) 15627 1726882488.53962: exiting _queue_task() for managed_node1/gather_facts 15627 1726882488.53975: done queuing things up, now waiting for results queue to drain 15627 1726882488.53976: waiting for pending results... 
15627 1726882488.54743: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882488.54865: in run() - task 0e448fcc-3ce9-2847-7723-000000000382 15627 1726882488.54898: variable 'ansible_search_path' from source: unknown 15627 1726882488.54927: calling self._execute() 15627 1726882488.55035: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882488.55041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882488.55049: variable 'omit' from source: magic vars 15627 1726882488.55339: variable 'ansible_distribution_major_version' from source: facts 15627 1726882488.55349: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882488.55353: variable 'omit' from source: magic vars 15627 1726882488.55379: variable 'omit' from source: magic vars 15627 1726882488.55403: variable 'omit' from source: magic vars 15627 1726882488.55432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882488.55461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882488.55483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882488.55508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882488.55512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882488.55548: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882488.55555: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882488.55565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882488.55634: Set connection var ansible_timeout to 10 15627 1726882488.55641: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882488.55645: Set connection var ansible_connection to ssh 15627 1726882488.55651: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882488.55659: Set connection var ansible_pipelining to False 15627 1726882488.55666: Set connection var ansible_shell_type to sh 15627 1726882488.55683: variable 'ansible_shell_executable' from source: unknown 15627 1726882488.55686: variable 'ansible_connection' from source: unknown 15627 1726882488.55688: variable 'ansible_module_compression' from source: unknown 15627 1726882488.55691: variable 'ansible_shell_type' from source: unknown 15627 1726882488.55693: variable 'ansible_shell_executable' from source: unknown 15627 1726882488.55699: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882488.55702: variable 'ansible_pipelining' from source: unknown 15627 1726882488.55705: variable 'ansible_timeout' from source: unknown 15627 1726882488.55713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882488.55842: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882488.55851: variable 'omit' from source: magic vars 15627 1726882488.55859: starting attempt loop 15627 1726882488.55861: running the handler 15627 1726882488.55877: variable 'ansible_facts' from source: unknown 15627 1726882488.55893: _low_level_execute_command(): starting 15627 1726882488.55900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882488.56436: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 
1726882488.56456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.56471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882488.56489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.56501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.56546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882488.56551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882488.56567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.56675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.58338: stdout chunk (state=3): >>>/root <<< 15627 1726882488.58437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882488.58483: stderr chunk (state=3): >>><<< 15627 1726882488.58486: stdout chunk (state=3): >>><<< 15627 1726882488.58508: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882488.58517: _low_level_execute_command(): starting 15627 1726882488.58523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640 `" && echo ansible-tmp-1726882488.5850616-16826-242247033288640="` echo /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640 `" ) && sleep 0' 15627 1726882488.58946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.58962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.58977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.59005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.59041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882488.59054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.59158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.61029: stdout chunk (state=3): >>>ansible-tmp-1726882488.5850616-16826-242247033288640=/root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640 <<< 15627 1726882488.61141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882488.61199: stderr chunk (state=3): >>><<< 15627 1726882488.61209: stdout chunk (state=3): >>><<< 15627 1726882488.61225: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882488.5850616-16826-242247033288640=/root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882488.61257: variable 'ansible_module_compression' from source: unknown 15627 1726882488.61319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882488.61373: variable 'ansible_facts' from source: unknown 15627 1726882488.61502: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/AnsiballZ_setup.py 15627 1726882488.61637: Sending initial data 15627 1726882488.61647: Sent initial data (154 bytes) 15627 1726882488.62520: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.62524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.62554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.62557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.62560: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.62631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882488.62636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.62730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.64687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882488.64779: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882488.64851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp30i0d04k /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/AnsiballZ_setup.py <<< 15627 1726882488.67513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882488.67658: stderr chunk (state=3): >>><<< 15627 1726882488.67670: stdout chunk (state=3): >>><<< 15627 1726882488.67761: done transferring module to remote 15627 1726882488.67768: 
_low_level_execute_command(): starting 15627 1726882488.67771: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/ /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/AnsiballZ_setup.py && sleep 0' 15627 1726882488.68291: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882488.68304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.68317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.68333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.68375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882488.68387: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882488.68404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.68422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882488.68433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882488.68444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882488.68456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.68473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.68489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.68505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882488.68518: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882488.68532: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.68607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882488.68627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882488.68635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.68732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882488.70433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882488.70487: stderr chunk (state=3): >>><<< 15627 1726882488.70494: stdout chunk (state=3): >>><<< 15627 1726882488.70553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882488.70558: _low_level_execute_command(): starting 15627 
1726882488.70561: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/AnsiballZ_setup.py && sleep 0' 15627 1726882488.70914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882488.70934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882488.70937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882488.70939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.70976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.70979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882488.70982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882488.71037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882488.71041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882488.71145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.22093: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": 
"#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ans<<< 15627 1726882489.22112: stdout chunk (state=3): 
>>>ible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 646, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241344512, "block_size": 4096, "block_total": 65519355, "block_available": 64512047, "block_used": 1007308, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": 
"xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansibl<<< 15627 1726882489.22150: stdout chunk (state=3): >>>e_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "49", "epoch": "1726882489", "epoch_int": "1726882489", "date": "2024-09-20", "time": "21:34:49", "iso8601_micro": "2024-09-21T01:34:49.181655Z", "iso8601": "2024-09-21T01:34:49Z", "iso8601_basic": "20240920T213449181655", "iso8601_basic_short": "20240920T213449", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off 
[fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation"<<< 15627 1726882489.22188: stdout chunk (state=3): >>>: "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882489.23838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882489.23841: stdout chunk (state=3): >>><<< 15627 1726882489.23844: stderr chunk (state=3): >>><<< 15627 1726882489.24174: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 
2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 646, 
"ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241344512, "block_size": 4096, "block_total": 65519355, "block_available": 64512047, "block_used": 1007308, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "49", "epoch": "1726882489", "epoch_int": "1726882489", "date": "2024-09-20", "time": "21:34:49", "iso8601_micro": "2024-09-21T01:34:49.181655Z", "iso8601": "2024-09-21T01:34:49Z", "iso8601_basic": "20240920T213449181655", "iso8601_basic_short": "20240920T213449", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": 
"eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off 
[fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882489.24256: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882489.24291: _low_level_execute_command(): starting 15627 1726882489.24302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882488.5850616-16826-242247033288640/ > /dev/null 2>&1 && sleep 0' 15627 1726882489.24991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.25006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.25022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.25049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.25095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.25109: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.25124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.25147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.25166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 15627 1726882489.25180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.25194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.25209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.25225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.25237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.25251: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.25271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.25349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.25379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.25398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.25525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.27348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.27428: stderr chunk (state=3): >>><<< 15627 1726882489.27431: stdout chunk (state=3): >>><<< 15627 1726882489.28069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882489.28072: handler run complete 15627 1726882489.28075: variable 'ansible_facts' from source: unknown 15627 1726882489.28077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.28079: variable 'ansible_facts' from source: unknown 15627 1726882489.28081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.28179: attempt loop complete, returning result 15627 1726882489.28189: _execute() done 15627 1726882489.28196: dumping result to json 15627 1726882489.28229: done dumping result, returning 15627 1726882489.28241: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-000000000382] 15627 1726882489.28251: sending task result for task 0e448fcc-3ce9-2847-7723-000000000382 ok: [managed_node1] 15627 1726882489.28868: no more pending results, returning what we have 15627 1726882489.28871: results queue empty 15627 1726882489.28872: checking for any_errors_fatal 15627 1726882489.28873: done checking for any_errors_fatal 15627 1726882489.28874: checking for max_fail_percentage 15627 1726882489.28875: done checking for max_fail_percentage 15627 1726882489.28876: checking to see if all hosts have failed and the running result is not ok 15627 1726882489.28877: done 
checking to see if all hosts have failed 15627 1726882489.28878: getting the remaining hosts for this loop 15627 1726882489.28879: done getting the remaining hosts for this loop 15627 1726882489.28882: getting the next task for host managed_node1 15627 1726882489.28887: done getting next task for host managed_node1 15627 1726882489.28889: ^ task is: TASK: meta (flush_handlers) 15627 1726882489.28891: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882489.28894: getting variables 15627 1726882489.28895: in VariableManager get_vars() 15627 1726882489.28915: Calling all_inventory to load vars for managed_node1 15627 1726882489.28918: Calling groups_inventory to load vars for managed_node1 15627 1726882489.28921: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.28931: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.28934: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.28946: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000382 15627 1726882489.28949: WORKER PROCESS EXITING 15627 1726882489.28957: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.30537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.32442: done with get_vars() 15627 1726882489.32470: done getting variables 15627 1726882489.32767: in VariableManager get_vars() 15627 1726882489.32778: Calling all_inventory to load vars for managed_node1 15627 1726882489.32780: Calling groups_inventory to load vars for managed_node1 15627 1726882489.32782: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.32787: Calling 
all_plugins_play to load vars for managed_node1 15627 1726882489.32789: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.32797: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.34084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.35772: done with get_vars() 15627 1726882489.35798: done queuing things up, now waiting for results queue to drain 15627 1726882489.35800: results queue empty 15627 1726882489.35801: checking for any_errors_fatal 15627 1726882489.35805: done checking for any_errors_fatal 15627 1726882489.35806: checking for max_fail_percentage 15627 1726882489.35807: done checking for max_fail_percentage 15627 1726882489.35807: checking to see if all hosts have failed and the running result is not ok 15627 1726882489.35808: done checking to see if all hosts have failed 15627 1726882489.35809: getting the remaining hosts for this loop 15627 1726882489.35810: done getting the remaining hosts for this loop 15627 1726882489.35813: getting the next task for host managed_node1 15627 1726882489.35817: done getting next task for host managed_node1 15627 1726882489.35820: ^ task is: TASK: Include the task 'delete_interface.yml' 15627 1726882489.35821: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882489.35823: getting variables 15627 1726882489.35824: in VariableManager get_vars() 15627 1726882489.35833: Calling all_inventory to load vars for managed_node1 15627 1726882489.35835: Calling groups_inventory to load vars for managed_node1 15627 1726882489.35838: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.35843: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.35845: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.35848: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.37051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.38731: done with get_vars() 15627 1726882489.38748: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:34:49 -0400 (0:00:00.855) 0:00:29.139 ****** 15627 1726882489.38814: entering _queue_task() for managed_node1/include_tasks 15627 1726882489.39190: worker is 1 (out of 1 available) 15627 1726882489.39204: exiting _queue_task() for managed_node1/include_tasks 15627 1726882489.39217: done queuing things up, now waiting for results queue to drain 15627 1726882489.39218: waiting for pending results... 
15627 1726882489.39510: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 15627 1726882489.39634: in run() - task 0e448fcc-3ce9-2847-7723-000000000052 15627 1726882489.39658: variable 'ansible_search_path' from source: unknown 15627 1726882489.39706: calling self._execute() 15627 1726882489.39798: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882489.39810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882489.39825: variable 'omit' from source: magic vars 15627 1726882489.40208: variable 'ansible_distribution_major_version' from source: facts 15627 1726882489.40227: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882489.40239: _execute() done 15627 1726882489.40246: dumping result to json 15627 1726882489.40254: done dumping result, returning 15627 1726882489.40268: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0e448fcc-3ce9-2847-7723-000000000052] 15627 1726882489.40280: sending task result for task 0e448fcc-3ce9-2847-7723-000000000052 15627 1726882489.40408: no more pending results, returning what we have 15627 1726882489.40414: in VariableManager get_vars() 15627 1726882489.40448: Calling all_inventory to load vars for managed_node1 15627 1726882489.40452: Calling groups_inventory to load vars for managed_node1 15627 1726882489.40456: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.40471: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.40475: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.40479: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.41583: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000052 15627 1726882489.41586: WORKER PROCESS EXITING 15627 1726882489.42136: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.43811: done with get_vars() 15627 1726882489.43829: variable 'ansible_search_path' from source: unknown 15627 1726882489.43842: we have included files to process 15627 1726882489.43843: generating all_blocks data 15627 1726882489.43844: done generating all_blocks data 15627 1726882489.43845: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15627 1726882489.43846: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15627 1726882489.43848: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15627 1726882489.44072: done processing included file 15627 1726882489.44074: iterating over new_blocks loaded from include file 15627 1726882489.44075: in VariableManager get_vars() 15627 1726882489.44087: done with get_vars() 15627 1726882489.44089: filtering new block on tags 15627 1726882489.44105: done filtering new block on tags 15627 1726882489.44107: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 15627 1726882489.44111: extending task lists for all hosts with included blocks 15627 1726882489.44141: done extending task lists 15627 1726882489.44142: done processing included files 15627 1726882489.44143: results queue empty 15627 1726882489.44144: checking for any_errors_fatal 15627 1726882489.44145: done checking for any_errors_fatal 15627 1726882489.44146: checking for max_fail_percentage 15627 1726882489.44147: done checking for max_fail_percentage 15627 1726882489.44148: checking to see if all hosts have failed and the running result 
is not ok 15627 1726882489.44148: done checking to see if all hosts have failed 15627 1726882489.44149: getting the remaining hosts for this loop 15627 1726882489.44150: done getting the remaining hosts for this loop 15627 1726882489.44153: getting the next task for host managed_node1 15627 1726882489.44156: done getting next task for host managed_node1 15627 1726882489.44159: ^ task is: TASK: Remove test interface if necessary 15627 1726882489.44161: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882489.44165: getting variables 15627 1726882489.44166: in VariableManager get_vars() 15627 1726882489.44174: Calling all_inventory to load vars for managed_node1 15627 1726882489.44176: Calling groups_inventory to load vars for managed_node1 15627 1726882489.44178: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.44182: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.44185: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.44188: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.45426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.47074: done with get_vars() 15627 1726882489.47095: done getting variables 15627 1726882489.47138: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:34:49 -0400 (0:00:00.083) 0:00:29.223 ****** 15627 1726882489.47169: entering _queue_task() for managed_node1/command 15627 1726882489.47565: worker is 1 (out of 1 available) 15627 1726882489.47578: exiting _queue_task() for managed_node1/command 15627 1726882489.47590: done queuing things up, now waiting for results queue to drain 15627 1726882489.47591: waiting for pending results... 
15627 1726882489.48032: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 15627 1726882489.48141: in run() - task 0e448fcc-3ce9-2847-7723-000000000393 15627 1726882489.48160: variable 'ansible_search_path' from source: unknown 15627 1726882489.48172: variable 'ansible_search_path' from source: unknown 15627 1726882489.48216: calling self._execute() 15627 1726882489.48318: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882489.48330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882489.48344: variable 'omit' from source: magic vars 15627 1726882489.48725: variable 'ansible_distribution_major_version' from source: facts 15627 1726882489.48746: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882489.48761: variable 'omit' from source: magic vars 15627 1726882489.48800: variable 'omit' from source: magic vars 15627 1726882489.48902: variable 'interface' from source: set_fact 15627 1726882489.48925: variable 'omit' from source: magic vars 15627 1726882489.48972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882489.49014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882489.49041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882489.49062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882489.49089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882489.49123: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882489.49132: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882489.49140: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882489.49245: Set connection var ansible_timeout to 10 15627 1726882489.49259: Set connection var ansible_shell_executable to /bin/sh 15627 1726882489.49272: Set connection var ansible_connection to ssh 15627 1726882489.49283: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882489.49295: Set connection var ansible_pipelining to False 15627 1726882489.49306: Set connection var ansible_shell_type to sh 15627 1726882489.49333: variable 'ansible_shell_executable' from source: unknown 15627 1726882489.49342: variable 'ansible_connection' from source: unknown 15627 1726882489.49349: variable 'ansible_module_compression' from source: unknown 15627 1726882489.49355: variable 'ansible_shell_type' from source: unknown 15627 1726882489.49362: variable 'ansible_shell_executable' from source: unknown 15627 1726882489.49371: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882489.49378: variable 'ansible_pipelining' from source: unknown 15627 1726882489.49385: variable 'ansible_timeout' from source: unknown 15627 1726882489.49392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882489.49609: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882489.49630: variable 'omit' from source: magic vars 15627 1726882489.49639: starting attempt loop 15627 1726882489.49646: running the handler 15627 1726882489.49669: _low_level_execute_command(): starting 15627 1726882489.49683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882489.50594: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 
1726882489.50612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.50628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.50649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.50697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.50710: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.50729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.50748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.50761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.50775: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.50788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.50802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.50818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.50834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.50845: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.50858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.50931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.50952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.50972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.51180: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.52837: stdout chunk (state=3): >>>/root <<< 15627 1726882489.53021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.53025: stdout chunk (state=3): >>><<< 15627 1726882489.53027: stderr chunk (state=3): >>><<< 15627 1726882489.53071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882489.53075: _low_level_execute_command(): starting 15627 1726882489.53154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604 `" && echo ansible-tmp-1726882489.5304885-16864-2351721093604="` echo /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604 `" ) 
&& sleep 0' 15627 1726882489.53709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.53724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.53740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.53759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.53803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.53817: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.53833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.53852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.53868: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.53880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.53893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.53908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.53924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.53937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.53948: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.53962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.54038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.54060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 
1726882489.54080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.54205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.56071: stdout chunk (state=3): >>>ansible-tmp-1726882489.5304885-16864-2351721093604=/root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604 <<< 15627 1726882489.56185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.56250: stderr chunk (state=3): >>><<< 15627 1726882489.56253: stdout chunk (state=3): >>><<< 15627 1726882489.56569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882489.5304885-16864-2351721093604=/root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882489.56572: variable 'ansible_module_compression' from 
source: unknown 15627 1726882489.56575: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15627 1726882489.56577: variable 'ansible_facts' from source: unknown 15627 1726882489.56579: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/AnsiballZ_command.py 15627 1726882489.56642: Sending initial data 15627 1726882489.56646: Sent initial data (154 bytes) 15627 1726882489.57613: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.57628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.57644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.57665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.57712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.57725: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.57739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.57759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.57776: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.57791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.57804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.57818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.57834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.57846: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.57857: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.57874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.57951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.57974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.57977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.58110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.59818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882489.59907: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882489.60004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpi89syukt /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/AnsiballZ_command.py <<< 15627 1726882489.60095: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882489.61357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.61595: stderr chunk (state=3): >>><<< 15627 1726882489.61598: stdout 
chunk (state=3): >>><<< 15627 1726882489.61600: done transferring module to remote 15627 1726882489.61602: _low_level_execute_command(): starting 15627 1726882489.61608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/ /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/AnsiballZ_command.py && sleep 0' 15627 1726882489.62226: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.62241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.62256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.62289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.62331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.62344: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.62359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.62388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.62450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.62475: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.62511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.62527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.62543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.62556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882489.62571: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.62586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.62675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.62698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.62725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.62854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.64636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.64639: stdout chunk (state=3): >>><<< 15627 1726882489.64642: stderr chunk (state=3): >>><<< 15627 1726882489.64726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 15627 1726882489.64729: _low_level_execute_command(): starting 15627 1726882489.64732: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/AnsiballZ_command.py && sleep 0' 15627 1726882489.65925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.65939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.65954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.65975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.66015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.66028: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.66041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.66057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.66071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.66081: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.66091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.66103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.66117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.66128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.66140: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.66154: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.66228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.66250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.66271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.66400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.80231: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:34:49.793441", "end": "2024-09-20 21:34:49.800775", "delta": "0:00:00.007334", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882489.81391: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882489.81418: stderr chunk (state=3): >>><<< 15627 1726882489.81422: stdout chunk (state=3): >>><<< 15627 1726882489.81572: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:34:49.793441", "end": "2024-09-20 21:34:49.800775", "delta": "0:00:00.007334", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 
15627 1726882489.81581: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882489.81584: _low_level_execute_command(): starting 15627 1726882489.81587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882489.5304885-16864-2351721093604/ > /dev/null 2>&1 && sleep 0' 15627 1726882489.82406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882489.82419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.82443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.82465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.82506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.82520: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882489.82535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.82563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882489.82579: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882489.82591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882489.82604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882489.82618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882489.82634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882489.82646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882489.82670: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882489.82686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882489.82760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882489.82792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882489.82809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882489.82932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882489.84824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882489.84827: stdout chunk (state=3): >>><<< 15627 1726882489.84830: stderr chunk (state=3): >>><<< 15627 1726882489.84976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882489.84979: handler run complete 15627 1726882489.84982: Evaluated conditional (False): False 15627 1726882489.84984: attempt loop complete, returning result 15627 1726882489.84986: _execute() done 15627 1726882489.84988: dumping result to json 15627 1726882489.84990: done dumping result, returning 15627 1726882489.84992: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0e448fcc-3ce9-2847-7723-000000000393] 15627 1726882489.84994: sending task result for task 0e448fcc-3ce9-2847-7723-000000000393 15627 1726882489.85068: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000393 15627 1726882489.85072: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007334", "end": "2024-09-20 21:34:49.800775", "rc": 1, "start": "2024-09-20 21:34:49.793441" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15627 1726882489.85336: no more pending results, returning what we have 15627 1726882489.85340: results queue empty 15627 1726882489.85340: checking for any_errors_fatal 15627 1726882489.85342: done checking for any_errors_fatal 15627 1726882489.85343: checking for max_fail_percentage 15627 1726882489.85344: done checking for max_fail_percentage 15627 1726882489.85345: checking to see if all hosts have failed and the running result is not ok 15627 1726882489.85346: done checking to see if all hosts have failed 15627 1726882489.85347: getting the remaining hosts for this loop 15627 1726882489.85348: done getting the remaining hosts for this loop 15627 1726882489.85352: getting the next task for host managed_node1 15627 1726882489.85360: done getting next task for host managed_node1 15627 1726882489.85362: ^ task is: TASK: meta (flush_handlers) 15627 1726882489.85366: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882489.85370: getting variables 15627 1726882489.85372: in VariableManager get_vars() 15627 1726882489.85404: Calling all_inventory to load vars for managed_node1 15627 1726882489.85407: Calling groups_inventory to load vars for managed_node1 15627 1726882489.85410: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.85421: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.85423: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.85426: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.87298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.89242: done with get_vars() 15627 1726882489.89262: done getting variables 15627 1726882489.89335: in VariableManager get_vars() 15627 1726882489.89344: Calling all_inventory to load vars for managed_node1 15627 1726882489.89346: Calling groups_inventory to load vars for managed_node1 15627 1726882489.89349: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.89353: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.89359: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.89363: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.92211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.94828: done with get_vars() 15627 1726882489.94861: done queuing things up, now waiting for results queue to drain 15627 1726882489.94865: results queue empty 15627 1726882489.94866: checking for any_errors_fatal 15627 1726882489.94870: done checking for any_errors_fatal 15627 1726882489.94871: checking for max_fail_percentage 15627 1726882489.94872: done checking for max_fail_percentage 15627 1726882489.94873: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882489.94874: done checking to see if all hosts have failed 15627 1726882489.94875: getting the remaining hosts for this loop 15627 1726882489.94876: done getting the remaining hosts for this loop 15627 1726882489.94879: getting the next task for host managed_node1 15627 1726882489.94883: done getting next task for host managed_node1 15627 1726882489.94885: ^ task is: TASK: meta (flush_handlers) 15627 1726882489.94887: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882489.94890: getting variables 15627 1726882489.94891: in VariableManager get_vars() 15627 1726882489.94906: Calling all_inventory to load vars for managed_node1 15627 1726882489.94909: Calling groups_inventory to load vars for managed_node1 15627 1726882489.94911: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.94916: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.94918: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882489.94921: Calling groups_plugins_play to load vars for managed_node1 15627 1726882489.96223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882489.98081: done with get_vars() 15627 1726882489.98101: done getting variables 15627 1726882489.98149: in VariableManager get_vars() 15627 1726882489.98197: Calling all_inventory to load vars for managed_node1 15627 1726882489.98200: Calling groups_inventory to load vars for managed_node1 15627 1726882489.98202: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882489.98207: Calling all_plugins_play to load vars for managed_node1 15627 1726882489.98209: Calling groups_plugins_inventory to load vars for 
managed_node1 15627 1726882489.98211: Calling groups_plugins_play to load vars for managed_node1 15627 1726882490.00281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882490.02063: done with get_vars() 15627 1726882490.02100: done queuing things up, now waiting for results queue to drain 15627 1726882490.02103: results queue empty 15627 1726882490.02103: checking for any_errors_fatal 15627 1726882490.02105: done checking for any_errors_fatal 15627 1726882490.02105: checking for max_fail_percentage 15627 1726882490.02106: done checking for max_fail_percentage 15627 1726882490.02107: checking to see if all hosts have failed and the running result is not ok 15627 1726882490.02108: done checking to see if all hosts have failed 15627 1726882490.02109: getting the remaining hosts for this loop 15627 1726882490.02110: done getting the remaining hosts for this loop 15627 1726882490.02113: getting the next task for host managed_node1 15627 1726882490.02116: done getting next task for host managed_node1 15627 1726882490.02117: ^ task is: None 15627 1726882490.02118: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882490.02120: done queuing things up, now waiting for results queue to drain 15627 1726882490.02121: results queue empty 15627 1726882490.02121: checking for any_errors_fatal 15627 1726882490.02122: done checking for any_errors_fatal 15627 1726882490.02123: checking for max_fail_percentage 15627 1726882490.02124: done checking for max_fail_percentage 15627 1726882490.02124: checking to see if all hosts have failed and the running result is not ok 15627 1726882490.02125: done checking to see if all hosts have failed 15627 1726882490.02126: getting the next task for host managed_node1 15627 1726882490.02129: done getting next task for host managed_node1 15627 1726882490.02130: ^ task is: None 15627 1726882490.02131: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882490.02202: in VariableManager get_vars() 15627 1726882490.02238: done with get_vars() 15627 1726882490.02245: in VariableManager get_vars() 15627 1726882490.02259: done with get_vars() 15627 1726882490.02274: variable 'omit' from source: magic vars 15627 1726882490.02483: variable 'profile' from source: play vars 15627 1726882490.02703: in VariableManager get_vars() 15627 1726882490.02725: done with get_vars() 15627 1726882490.02776: variable 'omit' from source: magic vars 15627 1726882490.02899: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15627 1726882490.03910: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882490.03934: getting the remaining hosts for this loop 15627 1726882490.03936: done getting the remaining hosts for this loop 15627 1726882490.03938: getting the next task for host managed_node1 15627 1726882490.03941: done getting next task for host managed_node1 15627 1726882490.03943: ^ task is: TASK: Gathering Facts 15627 1726882490.03944: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882490.03946: getting variables 15627 1726882490.03947: in VariableManager get_vars() 15627 1726882490.03967: Calling all_inventory to load vars for managed_node1 15627 1726882490.03970: Calling groups_inventory to load vars for managed_node1 15627 1726882490.03973: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882490.03981: Calling all_plugins_play to load vars for managed_node1 15627 1726882490.03996: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882490.04000: Calling groups_plugins_play to load vars for managed_node1 15627 1726882490.06184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882490.07990: done with get_vars() 15627 1726882490.08021: done getting variables 15627 1726882490.08075: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:34:50 -0400 (0:00:00.609) 0:00:29.832 ****** 15627 1726882490.08114: entering _queue_task() for managed_node1/gather_facts 15627 1726882490.08459: worker is 1 (out of 1 available) 15627 1726882490.08472: exiting _queue_task() for managed_node1/gather_facts 15627 1726882490.08484: done queuing things up, now waiting for results queue to drain 15627 1726882490.08485: waiting for pending results... 
15627 1726882490.08791: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882490.08908: in run() - task 0e448fcc-3ce9-2847-7723-0000000003a1 15627 1726882490.08936: variable 'ansible_search_path' from source: unknown 15627 1726882490.09010: calling self._execute() 15627 1726882490.09189: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882490.09224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882490.09268: variable 'omit' from source: magic vars 15627 1726882490.09788: variable 'ansible_distribution_major_version' from source: facts 15627 1726882490.09805: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882490.09844: variable 'omit' from source: magic vars 15627 1726882490.09896: variable 'omit' from source: magic vars 15627 1726882490.09983: variable 'omit' from source: magic vars 15627 1726882490.10032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882490.10133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882490.10180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882490.10215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882490.10234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882490.10307: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882490.10321: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882490.10329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882490.10488: Set connection var ansible_timeout to 10 15627 1726882490.10511: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882490.10525: Set connection var ansible_connection to ssh 15627 1726882490.10539: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882490.10547: Set connection var ansible_pipelining to False 15627 1726882490.10553: Set connection var ansible_shell_type to sh 15627 1726882490.10581: variable 'ansible_shell_executable' from source: unknown 15627 1726882490.10589: variable 'ansible_connection' from source: unknown 15627 1726882490.10596: variable 'ansible_module_compression' from source: unknown 15627 1726882490.10602: variable 'ansible_shell_type' from source: unknown 15627 1726882490.10610: variable 'ansible_shell_executable' from source: unknown 15627 1726882490.10618: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882490.10632: variable 'ansible_pipelining' from source: unknown 15627 1726882490.10645: variable 'ansible_timeout' from source: unknown 15627 1726882490.10653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882490.10905: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882490.10923: variable 'omit' from source: magic vars 15627 1726882490.10932: starting attempt loop 15627 1726882490.10943: running the handler 15627 1726882490.10971: variable 'ansible_facts' from source: unknown 15627 1726882490.10998: _low_level_execute_command(): starting 15627 1726882490.11010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882490.11877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882490.11892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882490.11905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.11923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.11977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.11990: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882490.12010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.12030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882490.12051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882490.12066: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882490.12080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.12095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.12113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.12127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.12140: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882490.12162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.12236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.12259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882490.12286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882490.12416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882490.14097: stdout chunk (state=3): >>>/root <<< 15627 1726882490.14277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882490.14281: stdout chunk (state=3): >>><<< 15627 1726882490.14283: stderr chunk (state=3): >>><<< 15627 1726882490.14401: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882490.14405: _low_level_execute_command(): starting 15627 1726882490.14408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303 `" && echo ansible-tmp-1726882490.1430633-16895-201784563944303="` echo /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303 `" ) && sleep 0' 15627 1726882490.15119: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882490.15133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.15156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.15179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.15227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.15240: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882490.15256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.15279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882490.15291: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882490.15306: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882490.15319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.15345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.15362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.15381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.15398: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882490.15415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.15503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.15524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882490.15541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882490.15672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882490.17531: stdout chunk (state=3): >>>ansible-tmp-1726882490.1430633-16895-201784563944303=/root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303 <<< 15627 1726882490.17682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882490.17777: stderr chunk (state=3): >>><<< 15627 1726882490.17786: stdout chunk (state=3): >>><<< 15627 1726882490.17816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882490.1430633-16895-201784563944303=/root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882490.17864: variable 'ansible_module_compression' from source: unknown 15627 1726882490.17907: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882490.17976: variable 'ansible_facts' from source: unknown 15627 1726882490.18161: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/AnsiballZ_setup.py 15627 1726882490.18633: Sending initial data 15627 1726882490.18642: Sent initial data (154 bytes) 15627 1726882490.19179: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882490.19199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.19203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.19239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.19244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.19250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.19337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.19347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882490.19383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882490.19615: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15627 1726882490.21316: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15627 1726882490.21320: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882490.21418: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 15627 1726882490.21422: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882490.21523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpo8d50eyv /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/AnsiballZ_setup.py <<< 15627 1726882490.21621: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882490.23745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882490.23834: stderr chunk (state=3): >>><<< 15627 1726882490.23837: stdout chunk (state=3): >>><<< 15627 1726882490.23857: done transferring module to remote 15627 1726882490.23877: _low_level_execute_command(): starting 15627 1726882490.23880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/ /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/AnsiballZ_setup.py && sleep 0' 15627 1726882490.24360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 
1726882490.24366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.24380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.24405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.24441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.24444: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882490.24453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.24484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.24487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.24553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.24556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882490.24657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882490.26401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882490.26479: stderr chunk (state=3): >>><<< 15627 1726882490.26490: stdout chunk (state=3): >>><<< 15627 1726882490.26569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882490.26572: _low_level_execute_command(): starting 15627 1726882490.26574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/AnsiballZ_setup.py && sleep 0' 15627 1726882490.27132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882490.27156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.27173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.27199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.27242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.27257: stderr chunk (state=3): >>>debug2: match not found <<< 15627 
1726882490.27282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.27302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882490.27312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882490.27321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882490.27330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.27340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.27361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.27378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.27388: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882490.27399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.27483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.27504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882490.27514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882490.27642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882490.78412: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": 
"3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINU<<< 15627 1726882490.78427: stdout chunk (state=3): >>>X_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": 
"/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "50", "epoch": "1726882490", "epoch_int": "1726882490", "date": "2024-09-20", "time": "21:34:50", "iso8601_micro": "2024-09-21T01:34:50.526217Z", "iso8601": "2024-09-21T01:34:50Z", "iso8601_basic": "20240920T213450526217", "iso8601_basic_short": "20240920T213450", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": 
true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": 
"off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 648, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, 
"passno": 0, "size_total": 268367278080, "size_available": 264241344512, "block_size": 4096, "block_total": 65519355, "block_available": 64512047, "block_used": 1007308, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882490.80252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882490.80256: stdout chunk (state=3): >>><<< 15627 1726882490.80274: stderr chunk (state=3): >>><<< 15627 1726882490.80309: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.38, "15m": 0.2}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "50", "epoch": "1726882490", "epoch_int": "1726882490", "date": "2024-09-20", "time": "21:34:50", "iso8601_micro": "2024-09-21T01:34:50.526217Z", "iso8601": "2024-09-21T01:34:50Z", "iso8601_basic": "20240920T213450526217", "iso8601_basic_short": "20240920T213450", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", 
"ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 648, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241344512, "block_size": 4096, "block_total": 65519355, "block_available": 64512047, "block_used": 1007308, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882490.80679: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882490.80698: _low_level_execute_command(): starting 15627 1726882490.80701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882490.1430633-16895-201784563944303/ > /dev/null 2>&1 && sleep 0' 15627 1726882490.82305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882490.82321: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.82337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.82356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.82407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.82420: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882490.82435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.82453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882490.82469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882490.82483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882490.82502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882490.82517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882490.82626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882490.82640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882490.82652: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882490.82669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882490.82748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882490.82774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882490.82790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882490.82920: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882490.84914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882490.84918: stdout chunk (state=3): >>><<< 15627 1726882490.84920: stderr chunk (state=3): >>><<< 15627 1726882490.85271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882490.85274: handler run complete 15627 1726882490.85276: variable 'ansible_facts' from source: unknown 15627 1726882490.85279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882490.85526: variable 'ansible_facts' from source: unknown 15627 1726882490.85731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882490.86041: attempt loop complete, returning result 
15627 1726882490.86050: _execute() done 15627 1726882490.86056: dumping result to json 15627 1726882490.86092: done dumping result, returning 15627 1726882490.86103: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-0000000003a1] 15627 1726882490.86112: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003a1 ok: [managed_node1] 15627 1726882490.86937: no more pending results, returning what we have 15627 1726882490.86940: results queue empty 15627 1726882490.86942: checking for any_errors_fatal 15627 1726882490.86943: done checking for any_errors_fatal 15627 1726882490.86944: checking for max_fail_percentage 15627 1726882490.86946: done checking for max_fail_percentage 15627 1726882490.86946: checking to see if all hosts have failed and the running result is not ok 15627 1726882490.86947: done checking to see if all hosts have failed 15627 1726882490.86948: getting the remaining hosts for this loop 15627 1726882490.86950: done getting the remaining hosts for this loop 15627 1726882490.86954: getting the next task for host managed_node1 15627 1726882490.86962: done getting next task for host managed_node1 15627 1726882490.86965: ^ task is: TASK: meta (flush_handlers) 15627 1726882490.86967: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882490.86971: getting variables 15627 1726882490.86973: in VariableManager get_vars() 15627 1726882490.87005: Calling all_inventory to load vars for managed_node1 15627 1726882490.87008: Calling groups_inventory to load vars for managed_node1 15627 1726882490.87010: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882490.87021: Calling all_plugins_play to load vars for managed_node1 15627 1726882490.87024: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882490.87027: Calling groups_plugins_play to load vars for managed_node1 15627 1726882490.87881: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003a1 15627 1726882490.87884: WORKER PROCESS EXITING 15627 1726882490.89822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882490.93490: done with get_vars() 15627 1726882490.93512: done getting variables 15627 1726882490.93586: in VariableManager get_vars() 15627 1726882490.93599: Calling all_inventory to load vars for managed_node1 15627 1726882490.93601: Calling groups_inventory to load vars for managed_node1 15627 1726882490.93603: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882490.93608: Calling all_plugins_play to load vars for managed_node1 15627 1726882490.93610: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882490.93618: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.03338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.06691: done with get_vars() 15627 1726882491.06717: done queuing things up, now waiting for results queue to drain 15627 1726882491.06719: results queue empty 15627 1726882491.06720: checking for any_errors_fatal 15627 1726882491.06724: done checking for any_errors_fatal 15627 1726882491.06724: checking for max_fail_percentage 15627 
1726882491.06725: done checking for max_fail_percentage 15627 1726882491.06726: checking to see if all hosts have failed and the running result is not ok 15627 1726882491.06727: done checking to see if all hosts have failed 15627 1726882491.06728: getting the remaining hosts for this loop 15627 1726882491.06728: done getting the remaining hosts for this loop 15627 1726882491.06731: getting the next task for host managed_node1 15627 1726882491.06734: done getting next task for host managed_node1 15627 1726882491.06737: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882491.06738: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882491.06746: getting variables 15627 1726882491.06747: in VariableManager get_vars() 15627 1726882491.06759: Calling all_inventory to load vars for managed_node1 15627 1726882491.06761: Calling groups_inventory to load vars for managed_node1 15627 1726882491.06765: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882491.06769: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.06771: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.06773: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.09440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.12522: done with get_vars() 15627 1726882491.12549: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:51 -0400 (0:00:01.045) 0:00:30.877 
****** 15627 1726882491.12621: entering _queue_task() for managed_node1/include_tasks 15627 1726882491.13615: worker is 1 (out of 1 available) 15627 1726882491.13627: exiting _queue_task() for managed_node1/include_tasks 15627 1726882491.13637: done queuing things up, now waiting for results queue to drain 15627 1726882491.13638: waiting for pending results... 15627 1726882491.14559: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15627 1726882491.14690: in run() - task 0e448fcc-3ce9-2847-7723-00000000005a 15627 1726882491.14779: variable 'ansible_search_path' from source: unknown 15627 1726882491.14787: variable 'ansible_search_path' from source: unknown 15627 1726882491.14826: calling self._execute() 15627 1726882491.15069: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.15207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.15224: variable 'omit' from source: magic vars 15627 1726882491.15938: variable 'ansible_distribution_major_version' from source: facts 15627 1726882491.16036: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882491.16079: _execute() done 15627 1726882491.16874: dumping result to json 15627 1726882491.16881: done dumping result, returning 15627 1726882491.16892: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-2847-7723-00000000005a] 15627 1726882491.16901: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005a 15627 1726882491.17102: no more pending results, returning what we have 15627 1726882491.17107: in VariableManager get_vars() 15627 1726882491.17149: Calling all_inventory to load vars for managed_node1 15627 1726882491.17152: Calling groups_inventory to load vars for managed_node1 15627 1726882491.17154: Calling all_plugins_inventory to load vars for 
managed_node1 15627 1726882491.17169: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.17172: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.17175: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.18771: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005a 15627 1726882491.18774: WORKER PROCESS EXITING 15627 1726882491.20444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.23682: done with get_vars() 15627 1726882491.23706: variable 'ansible_search_path' from source: unknown 15627 1726882491.23707: variable 'ansible_search_path' from source: unknown 15627 1726882491.23735: we have included files to process 15627 1726882491.23736: generating all_blocks data 15627 1726882491.23738: done generating all_blocks data 15627 1726882491.23739: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882491.23740: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882491.23742: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15627 1726882491.25214: done processing included file 15627 1726882491.25216: iterating over new_blocks loaded from include file 15627 1726882491.25217: in VariableManager get_vars() 15627 1726882491.25241: done with get_vars() 15627 1726882491.25243: filtering new block on tags 15627 1726882491.25259: done filtering new block on tags 15627 1726882491.25262: in VariableManager get_vars() 15627 1726882491.25285: done with get_vars() 15627 1726882491.25287: filtering new block on tags 15627 1726882491.25306: done filtering new block on tags 15627 1726882491.25309: in VariableManager get_vars() 15627 1726882491.25329: done with get_vars() 15627 
1726882491.25331: filtering new block on tags 15627 1726882491.25347: done filtering new block on tags 15627 1726882491.25349: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15627 1726882491.25354: extending task lists for all hosts with included blocks 15627 1726882491.27161: done extending task lists 15627 1726882491.27163: done processing included files 15627 1726882491.27165: results queue empty 15627 1726882491.27166: checking for any_errors_fatal 15627 1726882491.27168: done checking for any_errors_fatal 15627 1726882491.27169: checking for max_fail_percentage 15627 1726882491.27170: done checking for max_fail_percentage 15627 1726882491.27170: checking to see if all hosts have failed and the running result is not ok 15627 1726882491.27171: done checking to see if all hosts have failed 15627 1726882491.27172: getting the remaining hosts for this loop 15627 1726882491.27173: done getting the remaining hosts for this loop 15627 1726882491.27176: getting the next task for host managed_node1 15627 1726882491.27180: done getting next task for host managed_node1 15627 1726882491.27183: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882491.27186: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882491.27194: getting variables 15627 1726882491.27195: in VariableManager get_vars() 15627 1726882491.27208: Calling all_inventory to load vars for managed_node1 15627 1726882491.27211: Calling groups_inventory to load vars for managed_node1 15627 1726882491.27213: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882491.27218: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.27220: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.27223: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.31138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.34232: done with get_vars() 15627 1726882491.34260: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:51 -0400 (0:00:00.217) 0:00:31.094 ****** 15627 1726882491.34335: entering _queue_task() for managed_node1/setup 15627 1726882491.35361: worker is 1 (out of 1 available) 15627 1726882491.35375: exiting _queue_task() for managed_node1/setup 15627 1726882491.35388: done queuing things up, now waiting for results queue to drain 15627 1726882491.35390: waiting for pending results... 
15627 1726882491.36282: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15627 1726882491.36675: in run() - task 0e448fcc-3ce9-2847-7723-0000000003e2 15627 1726882491.36744: variable 'ansible_search_path' from source: unknown 15627 1726882491.36753: variable 'ansible_search_path' from source: unknown 15627 1726882491.36891: calling self._execute() 15627 1726882491.37089: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.37102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.37115: variable 'omit' from source: magic vars 15627 1726882491.37924: variable 'ansible_distribution_major_version' from source: facts 15627 1726882491.37983: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882491.38540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882491.42878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882491.42973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882491.43015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882491.43059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882491.43097: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882491.43186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882491.43219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882491.43249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882491.43308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882491.43328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882491.43392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882491.43420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882491.43448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882491.43506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882491.43527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882491.43696: variable '__network_required_facts' from source: role 
'' defaults 15627 1726882491.43716: variable 'ansible_facts' from source: unknown 15627 1726882491.44624: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15627 1726882491.44633: when evaluation is False, skipping this task 15627 1726882491.44657: _execute() done 15627 1726882491.44673: dumping result to json 15627 1726882491.44680: done dumping result, returning 15627 1726882491.44692: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-2847-7723-0000000003e2] 15627 1726882491.44705: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e2 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882491.44858: no more pending results, returning what we have 15627 1726882491.44865: results queue empty 15627 1726882491.44866: checking for any_errors_fatal 15627 1726882491.44868: done checking for any_errors_fatal 15627 1726882491.44869: checking for max_fail_percentage 15627 1726882491.44871: done checking for max_fail_percentage 15627 1726882491.44872: checking to see if all hosts have failed and the running result is not ok 15627 1726882491.44873: done checking to see if all hosts have failed 15627 1726882491.44874: getting the remaining hosts for this loop 15627 1726882491.44875: done getting the remaining hosts for this loop 15627 1726882491.44879: getting the next task for host managed_node1 15627 1726882491.44891: done getting next task for host managed_node1 15627 1726882491.44895: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882491.44898: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882491.44910: getting variables 15627 1726882491.44912: in VariableManager get_vars() 15627 1726882491.44958: Calling all_inventory to load vars for managed_node1 15627 1726882491.44962: Calling groups_inventory to load vars for managed_node1 15627 1726882491.44966: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882491.44977: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.44980: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.44984: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.46038: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e2 15627 1726882491.46041: WORKER PROCESS EXITING 15627 1726882491.47482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.51289: done with get_vars() 15627 1726882491.51314: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:51 -0400 (0:00:00.170) 0:00:31.265 ****** 15627 1726882491.51414: entering _queue_task() for managed_node1/stat 15627 1726882491.52134: worker is 1 (out of 1 available) 15627 1726882491.52146: exiting _queue_task() for managed_node1/stat 15627 1726882491.52158: done queuing things up, now waiting for results queue to drain 15627 1726882491.52160: waiting for pending results... 
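The skip logged above for the "Ensure ansible_facts used by role are present" task comes from its `when` expression, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: facts are only (re)gathered when some required key is missing from the cached facts. A minimal Python sketch of that set-difference gate (the function name and the sample fact values are hypothetical; only the variable names come from the log):

```python
def needs_fact_gathering(required_facts, ansible_facts):
    """Rough analogue of Jinja's difference filter check:
    gather facts only if some required key is not yet cached."""
    missing = [f for f in required_facts if f not in ansible_facts]
    return len(missing) > 0

# Hypothetical sample data: every required fact is already present,
# so the conditional evaluates False and the task is skipped,
# matching the "Evaluated conditional (...): False" line above.
required = ["distribution", "distribution_major_version"]
facts = {"distribution": "RedHat", "distribution_major_version": "9"}
print(needs_fact_gathering(required, facts))  # False -> task skipped
```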
15627 1726882491.53343: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15627 1726882491.53488: in run() - task 0e448fcc-3ce9-2847-7723-0000000003e4 15627 1726882491.53508: variable 'ansible_search_path' from source: unknown 15627 1726882491.53515: variable 'ansible_search_path' from source: unknown 15627 1726882491.53558: calling self._execute() 15627 1726882491.53660: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.53675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.53691: variable 'omit' from source: magic vars 15627 1726882491.54056: variable 'ansible_distribution_major_version' from source: facts 15627 1726882491.54683: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882491.54856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882491.55136: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882491.55815: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882491.55852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882491.55898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882491.56015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882491.56045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882491.56083: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882491.56119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882491.56217: variable '__network_is_ostree' from source: set_fact 15627 1726882491.56228: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882491.56234: when evaluation is False, skipping this task 15627 1726882491.56239: _execute() done 15627 1726882491.56244: dumping result to json 15627 1726882491.56249: done dumping result, returning 15627 1726882491.56260: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-2847-7723-0000000003e4] 15627 1726882491.56873: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e4 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882491.57023: no more pending results, returning what we have 15627 1726882491.57027: results queue empty 15627 1726882491.57028: checking for any_errors_fatal 15627 1726882491.57035: done checking for any_errors_fatal 15627 1726882491.57036: checking for max_fail_percentage 15627 1726882491.57037: done checking for max_fail_percentage 15627 1726882491.57038: checking to see if all hosts have failed and the running result is not ok 15627 1726882491.57039: done checking to see if all hosts have failed 15627 1726882491.57040: getting the remaining hosts for this loop 15627 1726882491.57042: done getting the remaining hosts for this loop 15627 1726882491.57045: getting the next task for host managed_node1 15627 1726882491.57052: done getting next task for host managed_node1 15627 
1726882491.57056: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882491.57058: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882491.57077: getting variables 15627 1726882491.57078: in VariableManager get_vars() 15627 1726882491.57115: Calling all_inventory to load vars for managed_node1 15627 1726882491.57118: Calling groups_inventory to load vars for managed_node1 15627 1726882491.57120: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882491.57131: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.57134: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.57137: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.58165: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e4 15627 1726882491.58171: WORKER PROCESS EXITING 15627 1726882491.59582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.63187: done with get_vars() 15627 1726882491.63215: done getting variables 15627 1726882491.63388: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
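Both the ostree stat check above and the "Set flag to indicate system is ostree" task that follows are guarded by the same conditional, `not __network_is_ostree is defined`: once the fact has been set (here it already was, via `set_fact`), neither task runs again. A rough Python analogue, with a plain dict standing in for the host's fact store (the helper name is hypothetical, and the `/run/ostree-booted` path is an assumption about what the stat task inspects):

```python
def should_check_ostree(host_facts):
    # Jinja: when: not __network_is_ostree is defined
    return "__network_is_ostree" not in host_facts

facts = {}
if should_check_ostree(facts):
    # In the role this would be the result of stat-ing a marker file
    # (assumed to be /run/ostree-booted); False here is illustrative.
    facts["__network_is_ostree"] = False

print(should_check_ostree(facts))  # False: later runs skip both tasks
```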
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:51 -0400 (0:00:00.120) 0:00:31.385 ****** 15627 1726882491.63423: entering _queue_task() for managed_node1/set_fact 15627 1726882491.64368: worker is 1 (out of 1 available) 15627 1726882491.64383: exiting _queue_task() for managed_node1/set_fact 15627 1726882491.64395: done queuing things up, now waiting for results queue to drain 15627 1726882491.64396: waiting for pending results... 15627 1726882491.65828: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15627 1726882491.65965: in run() - task 0e448fcc-3ce9-2847-7723-0000000003e5 15627 1726882491.65985: variable 'ansible_search_path' from source: unknown 15627 1726882491.65992: variable 'ansible_search_path' from source: unknown 15627 1726882491.66030: calling self._execute() 15627 1726882491.66124: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.66778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.66793: variable 'omit' from source: magic vars 15627 1726882491.67148: variable 'ansible_distribution_major_version' from source: facts 15627 1726882491.67169: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882491.67329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882491.67580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882491.68317: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882491.68356: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 
1726882491.68396: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882491.68501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882491.68531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882491.68562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882491.68597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882491.68685: variable '__network_is_ostree' from source: set_fact 15627 1726882491.68697: Evaluated conditional (not __network_is_ostree is defined): False 15627 1726882491.68705: when evaluation is False, skipping this task 15627 1726882491.68712: _execute() done 15627 1726882491.68718: dumping result to json 15627 1726882491.68725: done dumping result, returning 15627 1726882491.68736: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-2847-7723-0000000003e5] 15627 1726882491.68746: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e5 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15627 1726882491.68890: no more pending results, returning what we have 15627 1726882491.68893: results queue empty 15627 1726882491.68894: checking for any_errors_fatal 15627 1726882491.68899: done checking 
for any_errors_fatal 15627 1726882491.68900: checking for max_fail_percentage 15627 1726882491.68901: done checking for max_fail_percentage 15627 1726882491.68902: checking to see if all hosts have failed and the running result is not ok 15627 1726882491.68903: done checking to see if all hosts have failed 15627 1726882491.68904: getting the remaining hosts for this loop 15627 1726882491.68905: done getting the remaining hosts for this loop 15627 1726882491.68909: getting the next task for host managed_node1 15627 1726882491.68918: done getting next task for host managed_node1 15627 1726882491.68922: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882491.68924: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882491.68936: getting variables 15627 1726882491.68938: in VariableManager get_vars() 15627 1726882491.68975: Calling all_inventory to load vars for managed_node1 15627 1726882491.68978: Calling groups_inventory to load vars for managed_node1 15627 1726882491.68980: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882491.68990: Calling all_plugins_play to load vars for managed_node1 15627 1726882491.68993: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882491.68995: Calling groups_plugins_play to load vars for managed_node1 15627 1726882491.69971: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e5 15627 1726882491.69974: WORKER PROCESS EXITING 15627 1726882491.71784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882491.75143: done with get_vars() 15627 1726882491.75373: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:51 -0400 (0:00:00.120) 0:00:31.506 ****** 15627 1726882491.75473: entering _queue_task() for managed_node1/service_facts 15627 1726882491.76489: worker is 1 (out of 1 available) 15627 1726882491.76501: exiting _queue_task() for managed_node1/service_facts 15627 1726882491.76512: done queuing things up, now waiting for results queue to drain 15627 1726882491.76513: waiting for pending results... 
15627 1726882491.77522: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15627 1726882491.78292: in run() - task 0e448fcc-3ce9-2847-7723-0000000003e7 15627 1726882491.78313: variable 'ansible_search_path' from source: unknown 15627 1726882491.78321: variable 'ansible_search_path' from source: unknown 15627 1726882491.78361: calling self._execute() 15627 1726882491.78453: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.78469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.78485: variable 'omit' from source: magic vars 15627 1726882491.78838: variable 'ansible_distribution_major_version' from source: facts 15627 1726882491.79483: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882491.79495: variable 'omit' from source: magic vars 15627 1726882491.79554: variable 'omit' from source: magic vars 15627 1726882491.79595: variable 'omit' from source: magic vars 15627 1726882491.79637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882491.79679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882491.79705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882491.79727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882491.79743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882491.79779: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882491.79787: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.79798: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882491.79899: Set connection var ansible_timeout to 10 15627 1726882491.79913: Set connection var ansible_shell_executable to /bin/sh 15627 1726882491.79922: Set connection var ansible_connection to ssh 15627 1726882491.79931: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882491.79940: Set connection var ansible_pipelining to False 15627 1726882491.79946: Set connection var ansible_shell_type to sh 15627 1726882491.79976: variable 'ansible_shell_executable' from source: unknown 15627 1726882491.79984: variable 'ansible_connection' from source: unknown 15627 1726882491.79991: variable 'ansible_module_compression' from source: unknown 15627 1726882491.79997: variable 'ansible_shell_type' from source: unknown 15627 1726882491.80003: variable 'ansible_shell_executable' from source: unknown 15627 1726882491.80010: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882491.80018: variable 'ansible_pipelining' from source: unknown 15627 1726882491.80024: variable 'ansible_timeout' from source: unknown 15627 1726882491.80031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882491.80222: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882491.80880: variable 'omit' from source: magic vars 15627 1726882491.80890: starting attempt loop 15627 1726882491.80898: running the handler 15627 1726882491.80915: _low_level_execute_command(): starting 15627 1726882491.80930: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882491.82731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15627 1726882491.82735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882491.82891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.82895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882491.82897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.82954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882491.83088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882491.83091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882491.83205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882491.84898: stdout chunk (state=3): >>>/root <<< 15627 1726882491.85001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882491.85088: stderr chunk (state=3): >>><<< 15627 1726882491.85091: stdout chunk (state=3): >>><<< 15627 1726882491.85209: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882491.85213: _low_level_execute_command(): starting 15627 1726882491.85216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562 `" && echo ansible-tmp-1726882491.8511202-16992-254062204945562="` echo /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562 `" ) && sleep 0' 15627 1726882491.86747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882491.86750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882491.86788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.86799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882491.86802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.86973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882491.86995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882491.86998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882491.87093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882491.88967: stdout chunk (state=3): >>>ansible-tmp-1726882491.8511202-16992-254062204945562=/root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562 <<< 15627 1726882491.89072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882491.89139: stderr chunk (state=3): >>><<< 15627 1726882491.89142: stdout chunk (state=3): >>><<< 15627 1726882491.89171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882491.8511202-16992-254062204945562=/root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882491.89372: variable 'ansible_module_compression' from source: unknown 15627 1726882491.89375: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15627 1726882491.89377: variable 'ansible_facts' from source: unknown 15627 1726882491.89379: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/AnsiballZ_service_facts.py 15627 1726882491.90048: Sending initial data 15627 1726882491.90052: Sent initial data (162 bytes) 15627 1726882491.92606: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882491.92613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882491.92652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882491.92655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 15627 1726882491.92658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.92835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882491.92852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882491.92964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882491.94693: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882491.94786: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882491.94886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpge2ou7ay /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/AnsiballZ_service_facts.py <<< 15627 1726882491.94981: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882491.96556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 
1726882491.96769: stderr chunk (state=3): >>><<< 15627 1726882491.96773: stdout chunk (state=3): >>><<< 15627 1726882491.96775: done transferring module to remote 15627 1726882491.96777: _low_level_execute_command(): starting 15627 1726882491.96779: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/ /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/AnsiballZ_service_facts.py && sleep 0' 15627 1726882491.97840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882491.97857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882491.97877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882491.97896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882491.97945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882491.97962: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882491.97982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.98001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882491.98013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882491.98029: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882491.98044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882491.98061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882491.98082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882491.98095: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882491.98112: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882491.98125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882491.98213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882491.98230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882491.98249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882491.98484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882492.00270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882492.00291: stdout chunk (state=3): >>><<< 15627 1726882492.00295: stderr chunk (state=3): >>><<< 15627 1726882492.00393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882492.00397: _low_level_execute_command(): starting 15627 1726882492.00399: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/AnsiballZ_service_facts.py && sleep 0' 15627 1726882492.01091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882492.01110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882492.01123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882492.01139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882492.01187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882492.01199: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882492.01218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882492.01234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882492.01244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882492.01256: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882492.01271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882492.01283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882492.01301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882492.01317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882492.01328: stderr chunk 
(state=3): >>>debug2: match found <<< 15627 1726882492.01340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882492.01424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882492.01444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882492.01460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882492.01642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882493.32839: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 15627 1726882493.32856: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": 
{"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-f<<< 15627 1726882493.32879: stdout chunk (state=3): >>>ound", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 15627 1726882493.32915: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 15627 1726882493.32920: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": 
"active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper<<< 15627 1726882493.32928: stdout chunk (state=3): >>>-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15627 1726882493.34179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882493.34235: stderr chunk (state=3): >>><<< 15627 1726882493.34239: stdout chunk (state=3): >>><<< 15627 1726882493.34271: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": 
"active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882493.35122: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882493.35180: _low_level_execute_command(): starting 15627 1726882493.35199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882491.8511202-16992-254062204945562/ > /dev/null 2>&1 && sleep 0' 15627 1726882493.36313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882493.36327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.36345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.36367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.36408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.36420: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.36434: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.36457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.36474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.36487: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.36500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.36513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.36529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.36542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.36559: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882493.36675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.36758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.36832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882493.36847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882493.37636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882493.38745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882493.38800: stderr chunk (state=3): >>><<< 15627 1726882493.38802: stdout chunk (state=3): >>><<< 15627 1726882493.38839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882493.38842: handler run complete 15627 1726882493.38929: variable 'ansible_facts' from source: unknown 15627 1726882493.39031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882493.39284: variable 'ansible_facts' from source: unknown 15627 1726882493.39356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882493.39466: attempt loop complete, returning result 15627 1726882493.39469: _execute() done 15627 1726882493.39472: dumping result to json 15627 1726882493.39508: done dumping result, returning 15627 1726882493.39516: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-2847-7723-0000000003e7] 15627 1726882493.39521: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e7 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882493.40212: no more pending 
results, returning what we have 15627 1726882493.40215: results queue empty 15627 1726882493.40216: checking for any_errors_fatal 15627 1726882493.40220: done checking for any_errors_fatal 15627 1726882493.40221: checking for max_fail_percentage 15627 1726882493.40223: done checking for max_fail_percentage 15627 1726882493.40224: checking to see if all hosts have failed and the running result is not ok 15627 1726882493.40225: done checking to see if all hosts have failed 15627 1726882493.40226: getting the remaining hosts for this loop 15627 1726882493.40227: done getting the remaining hosts for this loop 15627 1726882493.40231: getting the next task for host managed_node1 15627 1726882493.40242: done getting next task for host managed_node1 15627 1726882493.40246: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882493.40249: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882493.40260: getting variables 15627 1726882493.40261: in VariableManager get_vars() 15627 1726882493.40598: Calling all_inventory to load vars for managed_node1 15627 1726882493.40601: Calling groups_inventory to load vars for managed_node1 15627 1726882493.40603: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882493.40612: Calling all_plugins_play to load vars for managed_node1 15627 1726882493.40614: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882493.40617: Calling groups_plugins_play to load vars for managed_node1 15627 1726882493.41299: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e7 15627 1726882493.41303: WORKER PROCESS EXITING 15627 1726882493.42113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882493.43906: done with get_vars() 15627 1726882493.43935: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:53 -0400 (0:00:01.685) 0:00:33.191 ****** 15627 1726882493.44031: entering _queue_task() for managed_node1/package_facts 15627 1726882493.44333: worker is 1 (out of 1 available) 15627 1726882493.44347: exiting _queue_task() for managed_node1/package_facts 15627 1726882493.44362: done queuing things up, now waiting for results queue to drain 15627 1726882493.44366: waiting for pending results... 
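(Editor's aside: the records above show the controller queuing the role's `package_facts` task from `roles/network/tasks/set_facts.yml:26`. As a hedged sketch — this is a hypothetical minimal equivalent, not the actual task file from the `fedora.linux_system_roles.network` collection — the task being executed looks roughly like:)

```yaml
# Hypothetical minimal reconstruction of the task driving the log records
# above; the real task lives at roles/network/tasks/set_facts.yml:26 in the
# fedora.linux_system_roles.network collection and may differ in detail.
- name: Check which packages are installed
  ansible.builtin.package_facts:
  # Results are stored under ansible_facts.packages, keyed by package name,
  # e.g. ansible_facts.packages['libgcc'][0]['version'] -> "11.5.0"
  # (per the rpm-sourced output visible later in this log).
```

Note that the preceding `service_facts` task ran with `no_log: true`, which is why its result is rendered as `"censored": "the output has been hidden..."` in the `ok: [managed_node1]` record above.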
15627 1726882493.44677: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15627 1726882493.44818: in run() - task 0e448fcc-3ce9-2847-7723-0000000003e8 15627 1726882493.44839: variable 'ansible_search_path' from source: unknown 15627 1726882493.44847: variable 'ansible_search_path' from source: unknown 15627 1726882493.44893: calling self._execute() 15627 1726882493.45001: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882493.45013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882493.45031: variable 'omit' from source: magic vars 15627 1726882493.45433: variable 'ansible_distribution_major_version' from source: facts 15627 1726882493.45456: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882493.45472: variable 'omit' from source: magic vars 15627 1726882493.45527: variable 'omit' from source: magic vars 15627 1726882493.45573: variable 'omit' from source: magic vars 15627 1726882493.45617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882493.45662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882493.45696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882493.45720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882493.45738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882493.45779: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882493.45792: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882493.45801: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15627 1726882493.45911: Set connection var ansible_timeout to 10 15627 1726882493.45925: Set connection var ansible_shell_executable to /bin/sh 15627 1726882493.45934: Set connection var ansible_connection to ssh 15627 1726882493.45942: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882493.45951: Set connection var ansible_pipelining to False 15627 1726882493.45961: Set connection var ansible_shell_type to sh 15627 1726882493.45991: variable 'ansible_shell_executable' from source: unknown 15627 1726882493.46000: variable 'ansible_connection' from source: unknown 15627 1726882493.46009: variable 'ansible_module_compression' from source: unknown 15627 1726882493.46016: variable 'ansible_shell_type' from source: unknown 15627 1726882493.46022: variable 'ansible_shell_executable' from source: unknown 15627 1726882493.46028: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882493.46035: variable 'ansible_pipelining' from source: unknown 15627 1726882493.46041: variable 'ansible_timeout' from source: unknown 15627 1726882493.46049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882493.46266: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882493.46284: variable 'omit' from source: magic vars 15627 1726882493.46296: starting attempt loop 15627 1726882493.46303: running the handler 15627 1726882493.46321: _low_level_execute_command(): starting 15627 1726882493.46338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882493.47104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882493.47119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15627 1726882493.47134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.47152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.47200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.47218: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.47233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.47251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.47268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.47280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.47292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.47306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.47322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.47335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.47346: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882493.47366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.47448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.47476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882493.47492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882493.47617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882493.49205: stdout chunk (state=3): >>>/root <<< 15627 1726882493.49311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882493.49427: stderr chunk (state=3): >>><<< 15627 1726882493.49431: stdout chunk (state=3): >>><<< 15627 1726882493.49538: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882493.49542: _low_level_execute_command(): starting 15627 1726882493.49545: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528 `" && echo ansible-tmp-1726882493.4945037-17052-250230494041528="` echo /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528 `" ) && sleep 0' 15627 1726882493.50317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882493.50332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.50347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.50372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.50416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.50429: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.50444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.50468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.50482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.50493: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.50507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.50522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.50538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.50550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.50567: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882493.50582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.50658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.50679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882493.50694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882493.50880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882493.52688: stdout chunk (state=3): >>>ansible-tmp-1726882493.4945037-17052-250230494041528=/root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528 <<< 15627 1726882493.52795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882493.52860: stderr chunk (state=3): >>><<< 15627 1726882493.52865: stdout chunk (state=3): >>><<< 15627 1726882493.53291: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882493.4945037-17052-250230494041528=/root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882493.53294: variable 'ansible_module_compression' from source: unknown 15627 1726882493.53296: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15627 1726882493.53298: variable 'ansible_facts' from source: unknown 15627 1726882493.53300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/AnsiballZ_package_facts.py 15627 1726882493.53362: Sending initial data 15627 1726882493.53367: Sent initial data (162 bytes) 15627 1726882493.54304: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882493.54319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.54334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.54352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.54397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.54410: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.54425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.54443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.54455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.54471: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.54486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.54501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.54517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.54529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 15627 1726882493.54540: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882493.54554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.54630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.54648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882493.54665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882493.54800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882493.56522: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882493.56613: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882493.56707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpz1cnfcwh /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/AnsiballZ_package_facts.py <<< 15627 1726882493.56795: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882493.59589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882493.59845: stderr chunk (state=3): >>><<< 15627 1726882493.59848: stdout chunk (state=3): >>><<< 15627 1726882493.59850: done 
transferring module to remote 15627 1726882493.59859: _low_level_execute_command(): starting 15627 1726882493.59862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/ /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/AnsiballZ_package_facts.py && sleep 0' 15627 1726882493.60440: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882493.60456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.60473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.60492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.60533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.60544: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.60561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.60582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.60593: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.60605: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.60620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.60634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.60649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.60663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.60676: stderr chunk (state=3): >>>debug2: 
match found <<< 15627 1726882493.60688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.60771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.60788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882493.60801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882493.60928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882493.62656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882493.62729: stderr chunk (state=3): >>><<< 15627 1726882493.62732: stdout chunk (state=3): >>><<< 15627 1726882493.62821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882493.62824: 
_low_level_execute_command(): starting 15627 1726882493.62828: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/AnsiballZ_package_facts.py && sleep 0' 15627 1726882493.63395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882493.63409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.63424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882493.63442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.63491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882493.63504: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882493.63518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.63537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882493.63549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882493.63567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882493.63580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882493.63596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882493.63600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882493.63684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882493.63687: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882493.63850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882494.10056: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 15627 1726882494.10183: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", 
"version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": 
"1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", 
"version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": 
"krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": 
"45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", 
"release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": 
"8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": 
"initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version":
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", 
"version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": 
"1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": 
[{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap",
"version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", 
"release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15627 1726882494.11859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882494.11865: stdout chunk (state=3): >>><<< 15627 1726882494.11867: stderr chunk (state=3): >>><<< 15627 1726882494.12176: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": 
[{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", 
"release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": 
[{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": 
"systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", 
"version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": 
"os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", 
"release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", 
"version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": 
"rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": 
"NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": 
"8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882494.14527: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882494.14555: _low_level_execute_command(): starting 15627 1726882494.14567: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882493.4945037-17052-250230494041528/ > /dev/null 2>&1 && sleep 0' 15627 1726882494.15256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882494.15273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882494.15287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882494.15304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882494.15348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882494.15366: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882494.15385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882494.15406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882494.15421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 15627 1726882494.15435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882494.15450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882494.15474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882494.15492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882494.15503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882494.15513: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882494.15524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882494.15607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882494.15631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882494.15647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882494.15775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882494.17616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882494.17698: stderr chunk (state=3): >>><<< 15627 1726882494.17709: stdout chunk (state=3): >>><<< 15627 1726882494.17772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882494.17776: handler run complete 15627 1726882494.18688: variable 'ansible_facts' from source: unknown 15627 1726882494.19231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.21575: variable 'ansible_facts' from source: unknown 15627 1726882494.22085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.22924: attempt loop complete, returning result 15627 1726882494.22953: _execute() done 15627 1726882494.22961: dumping result to json 15627 1726882494.23213: done dumping result, returning 15627 1726882494.23229: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-2847-7723-0000000003e8] 15627 1726882494.23239: sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e8 15627 1726882494.25533: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000003e8 15627 1726882494.25536: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882494.25725: no more pending results, returning what we have 15627 1726882494.25728: results queue empty 15627 1726882494.25729: checking for 
any_errors_fatal 15627 1726882494.25734: done checking for any_errors_fatal 15627 1726882494.25735: checking for max_fail_percentage 15627 1726882494.25736: done checking for max_fail_percentage 15627 1726882494.25738: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.25739: done checking to see if all hosts have failed 15627 1726882494.25740: getting the remaining hosts for this loop 15627 1726882494.25741: done getting the remaining hosts for this loop 15627 1726882494.25745: getting the next task for host managed_node1 15627 1726882494.25752: done getting next task for host managed_node1 15627 1726882494.25756: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882494.25758: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.25770: getting variables 15627 1726882494.25771: in VariableManager get_vars() 15627 1726882494.25807: Calling all_inventory to load vars for managed_node1 15627 1726882494.25810: Calling groups_inventory to load vars for managed_node1 15627 1726882494.25813: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.25825: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.25828: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.25831: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.27351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.29110: done with get_vars() 15627 1726882494.29134: done getting variables 15627 1726882494.29194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:54 -0400 (0:00:00.851) 0:00:34.043 ****** 15627 1726882494.29234: entering _queue_task() for managed_node1/debug 15627 1726882494.29526: worker is 1 (out of 1 available) 15627 1726882494.29542: exiting _queue_task() for managed_node1/debug 15627 1726882494.29554: done queuing things up, now waiting for results queue to drain 15627 1726882494.29555: waiting for pending results... 
15627 1726882494.29830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15627 1726882494.29949: in run() - task 0e448fcc-3ce9-2847-7723-00000000005b 15627 1726882494.29973: variable 'ansible_search_path' from source: unknown 15627 1726882494.29981: variable 'ansible_search_path' from source: unknown 15627 1726882494.30023: calling self._execute() 15627 1726882494.30126: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.30137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.30152: variable 'omit' from source: magic vars 15627 1726882494.30539: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.30555: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.30568: variable 'omit' from source: magic vars 15627 1726882494.30608: variable 'omit' from source: magic vars 15627 1726882494.30722: variable 'network_provider' from source: set_fact 15627 1726882494.30755: variable 'omit' from source: magic vars 15627 1726882494.30804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882494.30852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882494.30888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882494.30911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882494.30929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882494.30974: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882494.30983: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 
1726882494.30990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.31101: Set connection var ansible_timeout to 10 15627 1726882494.31115: Set connection var ansible_shell_executable to /bin/sh 15627 1726882494.31125: Set connection var ansible_connection to ssh 15627 1726882494.31135: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882494.31144: Set connection var ansible_pipelining to False 15627 1726882494.31151: Set connection var ansible_shell_type to sh 15627 1726882494.31187: variable 'ansible_shell_executable' from source: unknown 15627 1726882494.31198: variable 'ansible_connection' from source: unknown 15627 1726882494.31206: variable 'ansible_module_compression' from source: unknown 15627 1726882494.31213: variable 'ansible_shell_type' from source: unknown 15627 1726882494.31220: variable 'ansible_shell_executable' from source: unknown 15627 1726882494.31226: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.31234: variable 'ansible_pipelining' from source: unknown 15627 1726882494.31240: variable 'ansible_timeout' from source: unknown 15627 1726882494.31248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.31408: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882494.31427: variable 'omit' from source: magic vars 15627 1726882494.31437: starting attempt loop 15627 1726882494.31444: running the handler 15627 1726882494.31501: handler run complete 15627 1726882494.31526: attempt loop complete, returning result 15627 1726882494.31534: _execute() done 15627 1726882494.31541: dumping result to json 15627 1726882494.31548: done dumping result, returning 
15627 1726882494.31560: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-2847-7723-00000000005b] 15627 1726882494.31573: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005b ok: [managed_node1] => {} MSG: Using network provider: nm 15627 1726882494.31733: no more pending results, returning what we have 15627 1726882494.31736: results queue empty 15627 1726882494.31737: checking for any_errors_fatal 15627 1726882494.31748: done checking for any_errors_fatal 15627 1726882494.31749: checking for max_fail_percentage 15627 1726882494.31751: done checking for max_fail_percentage 15627 1726882494.31752: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.31753: done checking to see if all hosts have failed 15627 1726882494.31754: getting the remaining hosts for this loop 15627 1726882494.31756: done getting the remaining hosts for this loop 15627 1726882494.31760: getting the next task for host managed_node1 15627 1726882494.31770: done getting next task for host managed_node1 15627 1726882494.31775: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882494.31778: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.31788: getting variables 15627 1726882494.31790: in VariableManager get_vars() 15627 1726882494.31831: Calling all_inventory to load vars for managed_node1 15627 1726882494.31834: Calling groups_inventory to load vars for managed_node1 15627 1726882494.31837: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.31848: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.31852: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.31855: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.32812: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005b 15627 1726882494.32815: WORKER PROCESS EXITING 15627 1726882494.33614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.35553: done with get_vars() 15627 1726882494.35578: done getting variables 15627 1726882494.35635: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:54 -0400 (0:00:00.064) 0:00:34.108 ****** 15627 1726882494.35669: entering _queue_task() for managed_node1/fail 15627 1726882494.35950: worker is 1 (out of 1 available) 15627 1726882494.35964: exiting _queue_task() for managed_node1/fail 15627 1726882494.35976: done queuing things up, now waiting for results queue to drain 15627 1726882494.35977: waiting for pending results... 
15627 1726882494.36256: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15627 1726882494.36375: in run() - task 0e448fcc-3ce9-2847-7723-00000000005c 15627 1726882494.36395: variable 'ansible_search_path' from source: unknown 15627 1726882494.36403: variable 'ansible_search_path' from source: unknown 15627 1726882494.36451: calling self._execute() 15627 1726882494.36550: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.36561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.36576: variable 'omit' from source: magic vars 15627 1726882494.36982: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.37000: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.37131: variable 'network_state' from source: role '' defaults 15627 1726882494.37146: Evaluated conditional (network_state != {}): False 15627 1726882494.37154: when evaluation is False, skipping this task 15627 1726882494.37161: _execute() done 15627 1726882494.37170: dumping result to json 15627 1726882494.37184: done dumping result, returning 15627 1726882494.37196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-2847-7723-00000000005c] 15627 1726882494.37207: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882494.37354: no more pending results, returning what we have 15627 1726882494.37358: results queue empty 15627 1726882494.37359: checking for any_errors_fatal 15627 1726882494.37369: done 
checking for any_errors_fatal 15627 1726882494.37370: checking for max_fail_percentage 15627 1726882494.37372: done checking for max_fail_percentage 15627 1726882494.37373: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.37374: done checking to see if all hosts have failed 15627 1726882494.37375: getting the remaining hosts for this loop 15627 1726882494.37377: done getting the remaining hosts for this loop 15627 1726882494.37381: getting the next task for host managed_node1 15627 1726882494.37389: done getting next task for host managed_node1 15627 1726882494.37393: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882494.37396: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.37411: getting variables 15627 1726882494.37413: in VariableManager get_vars() 15627 1726882494.37451: Calling all_inventory to load vars for managed_node1 15627 1726882494.37454: Calling groups_inventory to load vars for managed_node1 15627 1726882494.37457: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.37477: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.37481: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.37484: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.38483: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005c 15627 1726882494.38487: WORKER PROCESS EXITING 15627 1726882494.39196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.40983: done with get_vars() 15627 1726882494.41008: done getting variables 15627 1726882494.41072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:54 -0400 (0:00:00.054) 0:00:34.162 ****** 15627 1726882494.41103: entering _queue_task() for managed_node1/fail 15627 1726882494.41390: worker is 1 (out of 1 available) 15627 1726882494.41402: exiting _queue_task() for managed_node1/fail 15627 1726882494.41413: done queuing things up, now waiting for results queue to drain 15627 1726882494.41414: waiting for pending results... 
15627 1726882494.41694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15627 1726882494.41802: in run() - task 0e448fcc-3ce9-2847-7723-00000000005d 15627 1726882494.41825: variable 'ansible_search_path' from source: unknown 15627 1726882494.41832: variable 'ansible_search_path' from source: unknown 15627 1726882494.41882: calling self._execute() 15627 1726882494.41985: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.41996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.42010: variable 'omit' from source: magic vars 15627 1726882494.42404: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.42420: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.42532: variable 'network_state' from source: role '' defaults 15627 1726882494.42545: Evaluated conditional (network_state != {}): False 15627 1726882494.42551: when evaluation is False, skipping this task 15627 1726882494.42558: _execute() done 15627 1726882494.42568: dumping result to json 15627 1726882494.42575: done dumping result, returning 15627 1726882494.42586: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-2847-7723-00000000005d] 15627 1726882494.42596: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005d 15627 1726882494.42715: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882494.42770: no more pending results, returning what we have 15627 1726882494.42774: results queue empty 15627 
1726882494.42775: checking for any_errors_fatal 15627 1726882494.42784: done checking for any_errors_fatal 15627 1726882494.42784: checking for max_fail_percentage 15627 1726882494.42786: done checking for max_fail_percentage 15627 1726882494.42788: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.42789: done checking to see if all hosts have failed 15627 1726882494.42789: getting the remaining hosts for this loop 15627 1726882494.42791: done getting the remaining hosts for this loop 15627 1726882494.42795: getting the next task for host managed_node1 15627 1726882494.42804: done getting next task for host managed_node1 15627 1726882494.42808: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882494.42812: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.42829: getting variables 15627 1726882494.42831: in VariableManager get_vars() 15627 1726882494.42878: Calling all_inventory to load vars for managed_node1 15627 1726882494.42882: Calling groups_inventory to load vars for managed_node1 15627 1726882494.42885: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.42898: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.42902: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.42905: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.43929: WORKER PROCESS EXITING 15627 1726882494.44784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.46057: done with get_vars() 15627 1726882494.46075: done getting variables 15627 1726882494.46115: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:54 -0400 (0:00:00.050) 0:00:34.213 ****** 15627 1726882494.46137: entering _queue_task() for managed_node1/fail 15627 1726882494.46340: worker is 1 (out of 1 available) 15627 1726882494.46352: exiting _queue_task() for managed_node1/fail 15627 1726882494.46365: done queuing things up, now waiting for results queue to drain 15627 1726882494.46367: waiting for pending results... 
15627 1726882494.46545: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15627 1726882494.46622: in run() - task 0e448fcc-3ce9-2847-7723-00000000005e 15627 1726882494.46632: variable 'ansible_search_path' from source: unknown 15627 1726882494.46635: variable 'ansible_search_path' from source: unknown 15627 1726882494.46670: calling self._execute() 15627 1726882494.46746: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.46751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.46761: variable 'omit' from source: magic vars 15627 1726882494.47031: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.47042: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.47168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882494.49245: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882494.49291: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882494.49318: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882494.49345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882494.49367: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882494.49423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.49444: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.49465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.49492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.49502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.49572: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.49584: Evaluated conditional (ansible_distribution_major_version | int > 9): False 15627 1726882494.49587: when evaluation is False, skipping this task 15627 1726882494.49590: _execute() done 15627 1726882494.49592: dumping result to json 15627 1726882494.49594: done dumping result, returning 15627 1726882494.49602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-2847-7723-00000000005e] 15627 1726882494.49607: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005e 15627 1726882494.49692: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005e 15627 1726882494.49695: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 15627 1726882494.49739: no more pending results, returning what we have 15627 1726882494.49742: 
results queue empty 15627 1726882494.49743: checking for any_errors_fatal 15627 1726882494.49750: done checking for any_errors_fatal 15627 1726882494.49750: checking for max_fail_percentage 15627 1726882494.49752: done checking for max_fail_percentage 15627 1726882494.49753: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.49757: done checking to see if all hosts have failed 15627 1726882494.49757: getting the remaining hosts for this loop 15627 1726882494.49759: done getting the remaining hosts for this loop 15627 1726882494.49762: getting the next task for host managed_node1 15627 1726882494.49770: done getting next task for host managed_node1 15627 1726882494.49774: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882494.49776: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.49788: getting variables 15627 1726882494.49789: in VariableManager get_vars() 15627 1726882494.49822: Calling all_inventory to load vars for managed_node1 15627 1726882494.49825: Calling groups_inventory to load vars for managed_node1 15627 1726882494.49827: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.49836: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.49838: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.49841: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.50739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.52552: done with get_vars() 15627 1726882494.52572: done getting variables 15627 1726882494.52612: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:54 -0400 (0:00:00.064) 0:00:34.277 ****** 15627 1726882494.52632: entering _queue_task() for managed_node1/dnf 15627 1726882494.52851: worker is 1 (out of 1 available) 15627 1726882494.52869: exiting _queue_task() for managed_node1/dnf 15627 1726882494.52881: done queuing things up, now waiting for results queue to drain 15627 1726882494.52882: waiting for pending results... 
15627 1726882494.53050: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15627 1726882494.53126: in run() - task 0e448fcc-3ce9-2847-7723-00000000005f 15627 1726882494.53137: variable 'ansible_search_path' from source: unknown 15627 1726882494.53142: variable 'ansible_search_path' from source: unknown 15627 1726882494.53173: calling self._execute() 15627 1726882494.53243: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.53247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.53258: variable 'omit' from source: magic vars 15627 1726882494.53523: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.53531: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.53676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882494.55654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882494.55707: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882494.55744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882494.55773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882494.55795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882494.55875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.55907: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.55937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.55987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.56006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.56121: variable 'ansible_distribution' from source: facts 15627 1726882494.56130: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.56149: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15627 1726882494.56275: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882494.56412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.56441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.56475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.56523: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.56541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.56589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.56621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.56649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.56698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.56722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.56770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.56800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 
1726882494.56835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.56882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.56902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.57074: variable 'network_connections' from source: play vars 15627 1726882494.57090: variable 'profile' from source: play vars 15627 1726882494.57147: variable 'profile' from source: play vars 15627 1726882494.57156: variable 'interface' from source: set_fact 15627 1726882494.57216: variable 'interface' from source: set_fact 15627 1726882494.57275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882494.57403: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882494.57430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882494.57452: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882494.57479: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882494.57509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882494.57525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882494.57547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.57569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882494.57605: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882494.57761: variable 'network_connections' from source: play vars 15627 1726882494.57765: variable 'profile' from source: play vars 15627 1726882494.57811: variable 'profile' from source: play vars 15627 1726882494.57814: variable 'interface' from source: set_fact 15627 1726882494.57854: variable 'interface' from source: set_fact 15627 1726882494.57878: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882494.57881: when evaluation is False, skipping this task 15627 1726882494.57884: _execute() done 15627 1726882494.57886: dumping result to json 15627 1726882494.57888: done dumping result, returning 15627 1726882494.57895: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-00000000005f] 15627 1726882494.57900: sending task result for task 0e448fcc-3ce9-2847-7723-00000000005f 15627 1726882494.57987: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000005f 15627 1726882494.57990: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15627 1726882494.58071: no more pending results, returning what we have 15627 1726882494.58075: results queue empty 15627 1726882494.58076: checking for any_errors_fatal 15627 1726882494.58082: done checking for any_errors_fatal 15627 1726882494.58083: checking for max_fail_percentage 15627 1726882494.58084: done checking for max_fail_percentage 15627 1726882494.58085: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.58086: done checking to see if all hosts have failed 15627 1726882494.58087: getting the remaining hosts for this loop 15627 1726882494.58088: done getting the remaining hosts for this loop 15627 1726882494.58091: getting the next task for host managed_node1 15627 1726882494.58097: done getting next task for host managed_node1 15627 1726882494.58101: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882494.58103: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.58114: getting variables 15627 1726882494.58116: in VariableManager get_vars() 15627 1726882494.58147: Calling all_inventory to load vars for managed_node1 15627 1726882494.58149: Calling groups_inventory to load vars for managed_node1 15627 1726882494.58151: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.58160: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.58163: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.58167: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.59002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.60458: done with get_vars() 15627 1726882494.60477: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15627 1726882494.60528: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:54 -0400 (0:00:00.079) 0:00:34.357 ****** 15627 1726882494.60547: entering _queue_task() for managed_node1/yum 15627 1726882494.60756: worker is 1 (out of 1 available) 15627 1726882494.60771: exiting _queue_task() for managed_node1/yum 15627 1726882494.60783: done queuing things up, now waiting for results queue to drain 15627 1726882494.60784: waiting for pending results... 
15627 1726882494.60957: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15627 1726882494.61043: in run() - task 0e448fcc-3ce9-2847-7723-000000000060 15627 1726882494.61053: variable 'ansible_search_path' from source: unknown 15627 1726882494.61056: variable 'ansible_search_path' from source: unknown 15627 1726882494.61091: calling self._execute() 15627 1726882494.61166: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.61170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.61178: variable 'omit' from source: magic vars 15627 1726882494.61441: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.61451: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.61575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882494.63738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882494.63817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882494.63843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882494.63899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882494.63938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882494.64058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.64108: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.64141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.64190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.64210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.64328: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.64365: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15627 1726882494.64381: when evaluation is False, skipping this task 15627 1726882494.64391: _execute() done 15627 1726882494.64406: dumping result to json 15627 1726882494.64418: done dumping result, returning 15627 1726882494.64438: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000060] 15627 1726882494.64453: sending task result for task 0e448fcc-3ce9-2847-7723-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15627 1726882494.64655: no more pending results, returning what we have 15627 1726882494.64670: results queue empty 15627 1726882494.64671: checking for any_errors_fatal 15627 1726882494.64684: done 
checking for any_errors_fatal 15627 1726882494.64685: checking for max_fail_percentage 15627 1726882494.64687: done checking for max_fail_percentage 15627 1726882494.64688: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.64689: done checking to see if all hosts have failed 15627 1726882494.64689: getting the remaining hosts for this loop 15627 1726882494.64691: done getting the remaining hosts for this loop 15627 1726882494.64694: getting the next task for host managed_node1 15627 1726882494.64701: done getting next task for host managed_node1 15627 1726882494.64711: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882494.64713: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.64722: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000060 15627 1726882494.64725: WORKER PROCESS EXITING 15627 1726882494.64732: getting variables 15627 1726882494.64733: in VariableManager get_vars() 15627 1726882494.64777: Calling all_inventory to load vars for managed_node1 15627 1726882494.64780: Calling groups_inventory to load vars for managed_node1 15627 1726882494.64782: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.64791: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.64794: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.64796: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.70122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.72123: done with get_vars() 15627 1726882494.72151: done getting variables 15627 1726882494.72222: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:54 -0400 (0:00:00.117) 0:00:34.474 ****** 15627 1726882494.72260: entering _queue_task() for managed_node1/fail 15627 1726882494.72632: worker is 1 (out of 1 available) 15627 1726882494.72645: exiting _queue_task() for managed_node1/fail 15627 1726882494.72660: done queuing things up, now waiting for results queue to drain 15627 1726882494.72662: waiting for pending results... 
15627 1726882494.72982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15627 1726882494.73117: in run() - task 0e448fcc-3ce9-2847-7723-000000000061 15627 1726882494.73140: variable 'ansible_search_path' from source: unknown 15627 1726882494.73161: variable 'ansible_search_path' from source: unknown 15627 1726882494.73211: calling self._execute() 15627 1726882494.73314: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.73334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.73348: variable 'omit' from source: magic vars 15627 1726882494.73755: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.73780: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.73918: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882494.74084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882494.76075: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882494.76133: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882494.76172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882494.76200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882494.76220: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882494.76294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882494.76317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.76335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.76386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.76417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.76482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.76523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.76541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.76583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.76596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.76633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.76656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.76678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.76705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.76715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.76831: variable 'network_connections' from source: play vars 15627 1726882494.76841: variable 'profile' from source: play vars 15627 1726882494.76888: variable 'profile' from source: play vars 15627 1726882494.76891: variable 'interface' from source: set_fact 15627 1726882494.76936: variable 'interface' from source: set_fact 15627 1726882494.76988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882494.77108: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882494.77135: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882494.77158: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882494.77185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882494.77214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882494.77230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882494.77248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.77269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882494.77308: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882494.77461: variable 'network_connections' from source: play vars 15627 1726882494.77467: variable 'profile' from source: play vars 15627 1726882494.77511: variable 'profile' from source: play vars 15627 1726882494.77515: variable 'interface' from source: set_fact 15627 1726882494.77557: variable 'interface' from source: set_fact 15627 1726882494.77580: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882494.77583: when evaluation is False, skipping this task 15627 1726882494.77586: _execute() done 15627 1726882494.77588: dumping result to json 15627 1726882494.77592: done dumping result, returning 15627 1726882494.77597: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000061] 15627 1726882494.77609: sending task result for task 0e448fcc-3ce9-2847-7723-000000000061 15627 1726882494.77693: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000061 15627 1726882494.77696: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882494.77746: no more pending results, returning what we have 15627 1726882494.77750: results queue empty 15627 1726882494.77751: checking for any_errors_fatal 15627 1726882494.77760: done checking for any_errors_fatal 15627 1726882494.77761: checking for max_fail_percentage 15627 1726882494.77762: done checking for max_fail_percentage 15627 1726882494.77764: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.77765: done checking to see if all hosts have failed 15627 1726882494.77766: getting the remaining hosts for this loop 15627 1726882494.77768: done getting the remaining hosts for this loop 15627 1726882494.77772: getting the next task for host managed_node1 15627 1726882494.77780: done getting next task for host managed_node1 15627 1726882494.77784: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15627 1726882494.77786: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.77797: getting variables 15627 1726882494.77799: in VariableManager get_vars() 15627 1726882494.77834: Calling all_inventory to load vars for managed_node1 15627 1726882494.77837: Calling groups_inventory to load vars for managed_node1 15627 1726882494.77839: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.77848: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.77850: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.77852: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.78687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.79643: done with get_vars() 15627 1726882494.79662: done getting variables 15627 1726882494.79705: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:54 -0400 (0:00:00.074) 0:00:34.548 ****** 15627 1726882494.79727: entering _queue_task() for managed_node1/package 15627 1726882494.79929: worker is 1 (out of 1 available) 15627 1726882494.79942: exiting _queue_task() for managed_node1/package 15627 1726882494.79954: done queuing things up, now waiting for results queue to drain 15627 1726882494.79956: waiting for pending results... 
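The skip recorded above for the "Ask user's consent to restart NetworkManager due to wireless or team interfaces" task follows directly from its `when:` clause, which the log echoes back as the `false_condition`. As a rough sketch (not the collection's actual source, which lives at `roles/network/tasks/main.yml:60`; only the condition and the `fail` action are taken from the log, the message text is an assumption):

```yaml
# Hypothetical reconstruction from the log above. The 'fail' action and the
# 'when' expression are confirmed by the log; the msg body is invented here
# purely for illustration.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: >-
      Changes to wireless or team interfaces require restarting
      NetworkManager, which may disrupt connectivity.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both role defaults evaluated to False for this profile, `_execute()` short-circuits before the action plugin ever runs, and the task is reported as `skipping` with `"skip_reason": "Conditional result was False"`.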
15627 1726882494.80130: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15627 1726882494.80212: in run() - task 0e448fcc-3ce9-2847-7723-000000000062 15627 1726882494.80229: variable 'ansible_search_path' from source: unknown 15627 1726882494.80233: variable 'ansible_search_path' from source: unknown 15627 1726882494.80265: calling self._execute() 15627 1726882494.80339: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.80343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.80351: variable 'omit' from source: magic vars 15627 1726882494.80621: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.80630: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.80767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882494.80949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882494.80986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882494.81011: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882494.81050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882494.81133: variable 'network_packages' from source: role '' defaults 15627 1726882494.81210: variable '__network_provider_setup' from source: role '' defaults 15627 1726882494.81219: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882494.81265: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882494.81275: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882494.81317: variable 
'__network_packages_default_nm' from source: role '' defaults 15627 1726882494.81439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882494.83031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882494.83075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882494.83101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882494.83124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882494.83145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882494.83201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.83219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.83238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.83268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.83281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 
1726882494.83310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.83325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.83343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.83371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.83382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.83533: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882494.83606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.83622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.83639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.83669: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.83680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.83741: variable 'ansible_python' from source: facts 15627 1726882494.83761: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882494.83818: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882494.83876: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882494.83957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.83977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.83995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.84022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.84031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.84067: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882494.84086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882494.84104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.84130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882494.84144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882494.84243: variable 'network_connections' from source: play vars 15627 1726882494.84250: variable 'profile' from source: play vars 15627 1726882494.84325: variable 'profile' from source: play vars 15627 1726882494.84330: variable 'interface' from source: set_fact 15627 1726882494.84384: variable 'interface' from source: set_fact 15627 1726882494.84432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882494.84455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882494.84478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882494.84499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882494.84533: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882494.84720: variable 'network_connections' from source: play vars 15627 1726882494.84723: variable 'profile' from source: play vars 15627 1726882494.84797: variable 'profile' from source: play vars 15627 1726882494.84803: variable 'interface' from source: set_fact 15627 1726882494.84850: variable 'interface' from source: set_fact 15627 1726882494.84878: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882494.84932: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882494.85130: variable 'network_connections' from source: play vars 15627 1726882494.85133: variable 'profile' from source: play vars 15627 1726882494.85180: variable 'profile' from source: play vars 15627 1726882494.85185: variable 'interface' from source: set_fact 15627 1726882494.85256: variable 'interface' from source: set_fact 15627 1726882494.85274: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882494.85333: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882494.85524: variable 'network_connections' from source: play vars 15627 1726882494.85528: variable 'profile' from source: play vars 15627 1726882494.85576: variable 'profile' from source: play vars 15627 1726882494.85579: variable 'interface' from source: set_fact 15627 1726882494.85648: variable 'interface' from source: set_fact 15627 1726882494.85690: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882494.85731: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882494.85735: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882494.85783: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882494.85932: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882494.86227: variable 'network_connections' from source: play vars 15627 1726882494.86230: variable 'profile' from source: play vars 15627 1726882494.86276: variable 'profile' from source: play vars 15627 1726882494.86280: variable 'interface' from source: set_fact 15627 1726882494.86326: variable 'interface' from source: set_fact 15627 1726882494.86332: variable 'ansible_distribution' from source: facts 15627 1726882494.86335: variable '__network_rh_distros' from source: role '' defaults 15627 1726882494.86340: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.86351: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882494.86460: variable 'ansible_distribution' from source: facts 15627 1726882494.86465: variable '__network_rh_distros' from source: role '' defaults 15627 1726882494.86468: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.86478: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882494.86584: variable 'ansible_distribution' from source: facts 15627 1726882494.86588: variable '__network_rh_distros' from source: role '' defaults 15627 1726882494.86591: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.86617: variable 'network_provider' from source: set_fact 15627 1726882494.86634: variable 'ansible_facts' from source: unknown 15627 1726882494.87023: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15627 
1726882494.87026: when evaluation is False, skipping this task 15627 1726882494.87029: _execute() done 15627 1726882494.87032: dumping result to json 15627 1726882494.87034: done dumping result, returning 15627 1726882494.87041: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-2847-7723-000000000062] 15627 1726882494.87046: sending task result for task 0e448fcc-3ce9-2847-7723-000000000062 15627 1726882494.87389: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000062 15627 1726882494.87393: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15627 1726882494.87433: no more pending results, returning what we have 15627 1726882494.87436: results queue empty 15627 1726882494.87437: checking for any_errors_fatal 15627 1726882494.87442: done checking for any_errors_fatal 15627 1726882494.87443: checking for max_fail_percentage 15627 1726882494.87445: done checking for max_fail_percentage 15627 1726882494.87445: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.87446: done checking to see if all hosts have failed 15627 1726882494.87447: getting the remaining hosts for this loop 15627 1726882494.87448: done getting the remaining hosts for this loop 15627 1726882494.87452: getting the next task for host managed_node1 15627 1726882494.87458: done getting next task for host managed_node1 15627 1726882494.87462: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882494.87467: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15627 1726882494.87480: getting variables 15627 1726882494.87481: in VariableManager get_vars() 15627 1726882494.87516: Calling all_inventory to load vars for managed_node1 15627 1726882494.87519: Calling groups_inventory to load vars for managed_node1 15627 1726882494.87521: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.87539: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.87542: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.87544: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.88748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.89694: done with get_vars() 15627 1726882494.89710: done getting variables 15627 1726882494.89751: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:54 -0400 (0:00:00.100) 0:00:34.649 ****** 15627 1726882494.89776: entering _queue_task() for managed_node1/package 15627 1726882494.89980: worker is 1 (out of 1 available) 15627 1726882494.89992: exiting _queue_task() for managed_node1/package 15627 1726882494.90005: done queuing things up, now waiting for results queue to drain 15627 1726882494.90007: waiting for pending results... 
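The "Install packages" skip above is the role's idempotency shortcut: the condition `not network_packages is subset(ansible_facts.packages.keys())` applies the `subset` test to the package facts gathered earlier in the run, so the `package` action is only queued when at least one entry of `network_packages` is missing. A hedged sketch of what the task at `main.yml:73` plausibly looks like (the action plugin name and the condition come from the log; the option names are assumptions):

```yaml
# Hypothetical sketch inferred from the 'package' action plugin and the
# false_condition in the log; 'name'/'state' values are assumptions.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Here every package was already present in `ansible_facts.packages`, so the test returned False and the package manager was never invoked on the managed node.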
15627 1726882494.90212: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15627 1726882494.90408: in run() - task 0e448fcc-3ce9-2847-7723-000000000063 15627 1726882494.90424: variable 'ansible_search_path' from source: unknown 15627 1726882494.90430: variable 'ansible_search_path' from source: unknown 15627 1726882494.90468: calling self._execute() 15627 1726882494.90560: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.90574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.90592: variable 'omit' from source: magic vars 15627 1726882494.90962: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.90982: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.91112: variable 'network_state' from source: role '' defaults 15627 1726882494.91127: Evaluated conditional (network_state != {}): False 15627 1726882494.91139: when evaluation is False, skipping this task 15627 1726882494.91145: _execute() done 15627 1726882494.91150: dumping result to json 15627 1726882494.91155: done dumping result, returning 15627 1726882494.91168: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-2847-7723-000000000063] 15627 1726882494.91177: sending task result for task 0e448fcc-3ce9-2847-7723-000000000063 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882494.91327: no more pending results, returning what we have 15627 1726882494.91330: results queue empty 15627 1726882494.91331: checking for any_errors_fatal 15627 1726882494.91337: done checking for any_errors_fatal 15627 1726882494.91338: checking for max_fail_percentage 15627 
1726882494.91340: done checking for max_fail_percentage 15627 1726882494.91341: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.91342: done checking to see if all hosts have failed 15627 1726882494.91343: getting the remaining hosts for this loop 15627 1726882494.91344: done getting the remaining hosts for this loop 15627 1726882494.91349: getting the next task for host managed_node1 15627 1726882494.91357: done getting next task for host managed_node1 15627 1726882494.91361: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882494.91364: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882494.91379: getting variables 15627 1726882494.91381: in VariableManager get_vars() 15627 1726882494.91421: Calling all_inventory to load vars for managed_node1 15627 1726882494.91424: Calling groups_inventory to load vars for managed_node1 15627 1726882494.91427: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.91440: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.91444: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.91447: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.92505: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000063 15627 1726882494.92509: WORKER PROCESS EXITING 15627 1726882494.93157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.94392: done with get_vars() 15627 1726882494.94408: done getting variables 15627 1726882494.94451: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:54 -0400 (0:00:00.046) 0:00:34.696 ****** 15627 1726882494.94475: entering _queue_task() for managed_node1/package 15627 1726882494.94673: worker is 1 (out of 1 available) 15627 1726882494.94685: exiting _queue_task() for managed_node1/package 15627 1726882494.94697: done queuing things up, now waiting for results queue to drain 15627 1726882494.94698: waiting for pending results... 15627 1726882494.94880: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15627 1726882494.94952: in run() - task 0e448fcc-3ce9-2847-7723-000000000064 15627 1726882494.94967: variable 'ansible_search_path' from source: unknown 15627 1726882494.94970: variable 'ansible_search_path' from source: unknown 15627 1726882494.95001: calling self._execute() 15627 1726882494.95080: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.95084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.95092: variable 'omit' from source: magic vars 15627 1726882494.95395: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.95412: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.95534: variable 'network_state' from source: role '' defaults 15627 1726882494.95551: Evaluated conditional (network_state != {}): False 15627 1726882494.95567: when evaluation is False, 
skipping this task 15627 1726882494.95572: _execute() done 15627 1726882494.95575: dumping result to json 15627 1726882494.95578: done dumping result, returning 15627 1726882494.95581: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-2847-7723-000000000064] 15627 1726882494.95584: sending task result for task 0e448fcc-3ce9-2847-7723-000000000064 15627 1726882494.95687: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000064 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882494.95736: no more pending results, returning what we have 15627 1726882494.95739: results queue empty 15627 1726882494.95740: checking for any_errors_fatal 15627 1726882494.95744: done checking for any_errors_fatal 15627 1726882494.95745: checking for max_fail_percentage 15627 1726882494.95746: done checking for max_fail_percentage 15627 1726882494.95747: checking to see if all hosts have failed and the running result is not ok 15627 1726882494.95748: done checking to see if all hosts have failed 15627 1726882494.95748: getting the remaining hosts for this loop 15627 1726882494.95750: done getting the remaining hosts for this loop 15627 1726882494.95753: getting the next task for host managed_node1 15627 1726882494.95762: done getting next task for host managed_node1 15627 1726882494.95768: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882494.95770: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882494.95793: getting variables 15627 1726882494.95794: in VariableManager get_vars() 15627 1726882494.95829: Calling all_inventory to load vars for managed_node1 15627 1726882494.95832: Calling groups_inventory to load vars for managed_node1 15627 1726882494.95834: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882494.95845: Calling all_plugins_play to load vars for managed_node1 15627 1726882494.95848: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882494.95851: Calling groups_plugins_play to load vars for managed_node1 15627 1726882494.96371: WORKER PROCESS EXITING 15627 1726882494.96960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882494.97914: done with get_vars() 15627 1726882494.97928: done getting variables 15627 1726882494.97972: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:54 -0400 (0:00:00.035) 0:00:34.731 ****** 15627 1726882494.97994: entering _queue_task() for managed_node1/service 15627 1726882494.98168: worker is 1 (out of 1 available) 15627 1726882494.98180: exiting _queue_task() for managed_node1/service 15627 1726882494.98192: done queuing things up, now waiting for results queue to drain 15627 1726882494.98193: waiting for pending results... 
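For orientation, the task just queued above (task path roles/network/tasks/main.yml:109) can be sketched as a plain service task. This is an approximate reconstruction from what the log itself reports (the action module, the task name, and the conditional), not the collection's verbatim source; the service name is an assumption:

```yaml
# Approximate reconstruction -- not the fedora.linux_system_roles source.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed; the role derives this from its defaults
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

As the trace below shows, both halves of the conditional evaluate to False for this play, so the task is skipped without ever reaching the connection layer.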
15627 1726882494.98370: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15627 1726882494.98444: in run() - task 0e448fcc-3ce9-2847-7723-000000000065 15627 1726882494.98455: variable 'ansible_search_path' from source: unknown 15627 1726882494.98461: variable 'ansible_search_path' from source: unknown 15627 1726882494.98493: calling self._execute() 15627 1726882494.98576: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882494.98580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882494.98589: variable 'omit' from source: magic vars 15627 1726882494.98977: variable 'ansible_distribution_major_version' from source: facts 15627 1726882494.98998: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882494.99127: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882494.99338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882495.01408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882495.01449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882495.01493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882495.01517: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882495.01537: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882495.01599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15627 1726882495.01618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.01636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.01666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.01677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.01710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.01727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.01743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.01771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.01783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.01812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.01828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.01844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.01873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.01883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.02026: variable 'network_connections' from source: play vars 15627 1726882495.02031: variable 'profile' from source: play vars 15627 1726882495.02062: variable 'profile' from source: play vars 15627 1726882495.02067: variable 'interface' from source: set_fact 15627 1726882495.02120: variable 'interface' from source: set_fact 15627 1726882495.02174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882495.02310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882495.02333: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882495.02358: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882495.02952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882495.02956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882495.02958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882495.02961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.02964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882495.02967: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882495.02969: variable 'network_connections' from source: play vars 15627 1726882495.02971: variable 'profile' from source: play vars 15627 1726882495.02974: variable 'profile' from source: play vars 15627 1726882495.02976: variable 'interface' from source: set_fact 15627 1726882495.02978: variable 'interface' from source: set_fact 15627 1726882495.02980: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15627 1726882495.02982: when evaluation is False, skipping this task 15627 1726882495.02987: _execute() done 15627 1726882495.02989: dumping result to json 15627 1726882495.02991: done dumping result, returning 15627 1726882495.02993: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-2847-7723-000000000065] 15627 1726882495.03003: sending task result for task 0e448fcc-3ce9-2847-7723-000000000065 15627 1726882495.03068: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000065 15627 1726882495.03071: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15627 1726882495.03334: no more pending results, returning what we have 15627 1726882495.03337: results queue empty 15627 1726882495.03338: checking for any_errors_fatal 15627 1726882495.03343: done checking for any_errors_fatal 15627 1726882495.03344: checking for max_fail_percentage 15627 1726882495.03348: done checking for max_fail_percentage 15627 1726882495.03348: checking to see if all hosts have failed and the running result is not ok 15627 1726882495.03349: done checking to see if all hosts have failed 15627 1726882495.03349: getting the remaining hosts for this loop 15627 1726882495.03350: done getting the remaining hosts for this loop 15627 1726882495.03356: getting the next task for host managed_node1 15627 1726882495.03361: done getting next task for host managed_node1 15627 1726882495.03367: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882495.03369: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882495.03378: getting variables 15627 1726882495.03380: in VariableManager get_vars() 15627 1726882495.03408: Calling all_inventory to load vars for managed_node1 15627 1726882495.03410: Calling groups_inventory to load vars for managed_node1 15627 1726882495.03412: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882495.03418: Calling all_plugins_play to load vars for managed_node1 15627 1726882495.03420: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882495.03422: Calling groups_plugins_play to load vars for managed_node1 15627 1726882495.04789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882495.05749: done with get_vars() 15627 1726882495.05768: done getting variables 15627 1726882495.05810: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:55 -0400 (0:00:00.078) 0:00:34.809 ****** 15627 1726882495.05831: entering _queue_task() for managed_node1/service 15627 1726882495.06041: worker is 1 (out of 1 available) 15627 1726882495.06058: exiting _queue_task() for managed_node1/service 15627 1726882495.06071: done queuing things up, now waiting for results queue to drain 15627 1726882495.06073: waiting for pending results... 
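The "Enable and start NetworkManager" task queued above (task path roles/network/tasks/main.yml:122) is the first task in this stretch whose conditional holds, so it proceeds to a real connection. A rough sketch, again reconstructed from the variables and conditional the log reports rather than from the role source:

```yaml
# Approximate reconstruction -- not the fedora.linux_system_roles source.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolves via role defaults; NetworkManager under the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

The long run of FilterModule/TestModule loads that follows is Jinja2 plugin resolution triggered by templating these role defaults on each evaluation.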
15627 1726882495.06242: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15627 1726882495.06318: in run() - task 0e448fcc-3ce9-2847-7723-000000000066 15627 1726882495.06327: variable 'ansible_search_path' from source: unknown 15627 1726882495.06330: variable 'ansible_search_path' from source: unknown 15627 1726882495.06360: calling self._execute() 15627 1726882495.06439: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.06446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.06457: variable 'omit' from source: magic vars 15627 1726882495.06836: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.06871: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882495.07041: variable 'network_provider' from source: set_fact 15627 1726882495.07051: variable 'network_state' from source: role '' defaults 15627 1726882495.07073: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15627 1726882495.07093: variable 'omit' from source: magic vars 15627 1726882495.07131: variable 'omit' from source: magic vars 15627 1726882495.07166: variable 'network_service_name' from source: role '' defaults 15627 1726882495.07245: variable 'network_service_name' from source: role '' defaults 15627 1726882495.07375: variable '__network_provider_setup' from source: role '' defaults 15627 1726882495.07389: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882495.07455: variable '__network_service_name_default_nm' from source: role '' defaults 15627 1726882495.07467: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882495.07530: variable '__network_packages_default_nm' from source: role '' defaults 15627 1726882495.07690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15627 1726882495.09202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882495.09252: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882495.09287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882495.09312: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882495.09332: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882495.09395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.09414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.09432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.09460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.09477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.09511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15627 1726882495.09527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.09543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.09573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.09586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.09736: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15627 1726882495.09811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.09830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.09846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.09875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.09885: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.09951: variable 'ansible_python' from source: facts 15627 1726882495.09970: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15627 1726882495.10024: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882495.10083: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882495.10168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.10185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.10201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.10225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.10237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.10274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.10293: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.10309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.10333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.10344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.10436: variable 'network_connections' from source: play vars 15627 1726882495.10441: variable 'profile' from source: play vars 15627 1726882495.10497: variable 'profile' from source: play vars 15627 1726882495.10501: variable 'interface' from source: set_fact 15627 1726882495.10542: variable 'interface' from source: set_fact 15627 1726882495.10614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882495.10742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882495.10781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882495.10812: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882495.10841: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882495.10888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882495.10909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882495.10931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.10953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882495.10990: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882495.11169: variable 'network_connections' from source: play vars 15627 1726882495.11175: variable 'profile' from source: play vars 15627 1726882495.11229: variable 'profile' from source: play vars 15627 1726882495.11234: variable 'interface' from source: set_fact 15627 1726882495.11277: variable 'interface' from source: set_fact 15627 1726882495.11301: variable '__network_packages_default_wireless' from source: role '' defaults 15627 1726882495.11358: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882495.11541: variable 'network_connections' from source: play vars 15627 1726882495.11549: variable 'profile' from source: play vars 15627 1726882495.11599: variable 'profile' from source: play vars 15627 1726882495.11602: variable 'interface' from source: set_fact 15627 1726882495.11659: variable 'interface' from source: set_fact 15627 1726882495.11676: variable '__network_packages_default_team' from source: role '' defaults 15627 1726882495.11728: variable '__network_team_connections_defined' from source: role '' defaults 15627 1726882495.11919: variable 
'network_connections' from source: play vars 15627 1726882495.11922: variable 'profile' from source: play vars 15627 1726882495.11972: variable 'profile' from source: play vars 15627 1726882495.11980: variable 'interface' from source: set_fact 15627 1726882495.12029: variable 'interface' from source: set_fact 15627 1726882495.12070: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882495.12113: variable '__network_service_name_default_initscripts' from source: role '' defaults 15627 1726882495.12119: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882495.12161: variable '__network_packages_default_initscripts' from source: role '' defaults 15627 1726882495.12297: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15627 1726882495.12731: variable 'network_connections' from source: play vars 15627 1726882495.12739: variable 'profile' from source: play vars 15627 1726882495.12782: variable 'profile' from source: play vars 15627 1726882495.12785: variable 'interface' from source: set_fact 15627 1726882495.12832: variable 'interface' from source: set_fact 15627 1726882495.12838: variable 'ansible_distribution' from source: facts 15627 1726882495.12845: variable '__network_rh_distros' from source: role '' defaults 15627 1726882495.12848: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.12862: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15627 1726882495.12969: variable 'ansible_distribution' from source: facts 15627 1726882495.12973: variable '__network_rh_distros' from source: role '' defaults 15627 1726882495.12981: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.12988: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15627 1726882495.13099: variable 'ansible_distribution' from source: 
facts 15627 1726882495.13102: variable '__network_rh_distros' from source: role '' defaults 15627 1726882495.13107: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.13131: variable 'network_provider' from source: set_fact 15627 1726882495.13147: variable 'omit' from source: magic vars 15627 1726882495.13171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882495.13190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882495.13205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882495.13217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882495.13226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882495.13248: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882495.13251: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.13256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.13323: Set connection var ansible_timeout to 10 15627 1726882495.13330: Set connection var ansible_shell_executable to /bin/sh 15627 1726882495.13335: Set connection var ansible_connection to ssh 15627 1726882495.13340: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882495.13345: Set connection var ansible_pipelining to False 15627 1726882495.13347: Set connection var ansible_shell_type to sh 15627 1726882495.13368: variable 'ansible_shell_executable' from source: unknown 15627 1726882495.13371: variable 'ansible_connection' from source: unknown 15627 1726882495.13374: variable 'ansible_module_compression' from source: unknown 15627 1726882495.13376: 
variable 'ansible_shell_type' from source: unknown 15627 1726882495.13378: variable 'ansible_shell_executable' from source: unknown 15627 1726882495.13380: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.13391: variable 'ansible_pipelining' from source: unknown 15627 1726882495.13394: variable 'ansible_timeout' from source: unknown 15627 1726882495.13400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.13460: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882495.13470: variable 'omit' from source: magic vars 15627 1726882495.13474: starting attempt loop 15627 1726882495.13477: running the handler 15627 1726882495.13533: variable 'ansible_facts' from source: unknown 15627 1726882495.13990: _low_level_execute_command(): starting 15627 1726882495.13996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882495.14508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.14522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.14540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882495.14552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.14567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.14610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.14621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.14733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.16400: stdout chunk (state=3): >>>/root <<< 15627 1726882495.16501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.16559: stderr chunk (state=3): >>><<< 15627 1726882495.16563: stdout chunk (state=3): >>><<< 15627 1726882495.16580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.16589: _low_level_execute_command(): starting 15627 1726882495.16595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496 `" && echo ansible-tmp-1726882495.1657941-17117-127153851427496="` echo /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496 `" ) && sleep 0' 15627 1726882495.17040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.17059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.17079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.17090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.17138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.17149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 15627 1726882495.17249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.19095: stdout chunk (state=3): >>>ansible-tmp-1726882495.1657941-17117-127153851427496=/root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496 <<< 15627 1726882495.19210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.19254: stderr chunk (state=3): >>><<< 15627 1726882495.19257: stdout chunk (state=3): >>><<< 15627 1726882495.19273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882495.1657941-17117-127153851427496=/root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.19297: variable 'ansible_module_compression' from source: unknown 15627 1726882495.19341: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15627 1726882495.19394: variable 'ansible_facts' from source: unknown 15627 1726882495.19530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/AnsiballZ_systemd.py 15627 1726882495.19636: Sending initial data 15627 1726882495.19645: Sent initial data (156 bytes) 15627 1726882495.20307: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.20313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.20359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.20362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.20367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.20418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.20423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.20524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.22218: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15627 1726882495.22228: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882495.22313: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882495.22405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpy77d361g /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/AnsiballZ_systemd.py <<< 15627 1726882495.22496: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882495.24803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.24893: stderr chunk (state=3): >>><<< 15627 1726882495.24897: stdout chunk (state=3): >>><<< 15627 1726882495.24911: done transferring module to remote 15627 1726882495.24919: _low_level_execute_command(): starting 15627 1726882495.24924: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/ /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/AnsiballZ_systemd.py && sleep 0' 15627 1726882495.25353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.25359: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.25388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15627 1726882495.25392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.25444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.25447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.25546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.27278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.27341: stderr chunk (state=3): >>><<< 15627 1726882495.27352: stdout chunk (state=3): >>><<< 15627 1726882495.27441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.27444: _low_level_execute_command(): starting 15627 1726882495.27447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/AnsiballZ_systemd.py && sleep 0' 15627 1726882495.27981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882495.27998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.28012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.28028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.28071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.28085: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882495.28106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.28124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15627 1726882495.28127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.28200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882495.28207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.28296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.53435: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 15627 1726882495.53459: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16097280", "MemoryAvailable": "infinity", "CPUUsageNSec": "821044000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 15627 1726882495.53472: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15627 1726882495.55000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882495.55058: stderr chunk (state=3): >>><<< 15627 1726882495.55062: stdout chunk (state=3): >>><<< 15627 1726882495.55081: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16097280", "MemoryAvailable": "infinity", "CPUUsageNSec": "821044000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882495.55192: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882495.55208: _low_level_execute_command(): starting 15627 1726882495.55212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882495.1657941-17117-127153851427496/ > /dev/null 2>&1 && sleep 0' 15627 1726882495.55671: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.55684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.55704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882495.55716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882495.55725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.55775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.55791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.55885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.57720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.57766: stderr chunk (state=3): >>><<< 15627 1726882495.57770: stdout chunk (state=3): >>><<< 15627 1726882495.57780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.57788: handler run complete 15627 1726882495.57824: attempt loop complete, returning result 15627 1726882495.57827: _execute() done 15627 1726882495.57830: dumping result to json 15627 1726882495.57840: done dumping result, returning 15627 1726882495.57849: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-2847-7723-000000000066] 15627 1726882495.57853: sending task result for task 0e448fcc-3ce9-2847-7723-000000000066 15627 1726882495.58075: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000066 15627 1726882495.58078: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882495.58126: no more pending results, returning what we have 15627 1726882495.58129: results queue empty 15627 1726882495.58130: checking for any_errors_fatal 15627 1726882495.58135: done checking for any_errors_fatal 15627 1726882495.58136: checking for max_fail_percentage 15627 1726882495.58138: done checking for max_fail_percentage 15627 1726882495.58138: checking to see if all hosts have failed and the running result is not ok 15627 1726882495.58139: done checking to see if all hosts have failed 15627 1726882495.58140: getting the remaining hosts for this loop 15627 1726882495.58142: done getting the remaining hosts for this loop 15627 1726882495.58145: getting the next task for host managed_node1 15627 1726882495.58151: done getting next task for host managed_node1 15627 1726882495.58157: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15627 1726882495.58159: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882495.58170: getting variables 15627 1726882495.58171: in VariableManager get_vars() 15627 1726882495.58203: Calling all_inventory to load vars for managed_node1 15627 1726882495.58206: Calling groups_inventory to load vars for managed_node1 15627 1726882495.58208: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882495.58217: Calling all_plugins_play to load vars for managed_node1 15627 1726882495.58219: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882495.58221: Calling groups_plugins_play to load vars for managed_node1 15627 1726882495.59174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882495.60118: done with get_vars() 15627 1726882495.60133: done getting variables 15627 1726882495.60181: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:55 -0400 (0:00:00.543) 0:00:35.353 ****** 15627 1726882495.60202: entering _queue_task() for managed_node1/service 15627 1726882495.60414: worker is 1 (out of 1 available) 15627 1726882495.60426: exiting _queue_task() for managed_node1/service 15627 1726882495.60437: done queuing things up, now waiting for results queue to drain 15627 1726882495.60439: waiting for pending results... 
15627 1726882495.60612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15627 1726882495.60685: in run() - task 0e448fcc-3ce9-2847-7723-000000000067 15627 1726882495.60697: variable 'ansible_search_path' from source: unknown 15627 1726882495.60701: variable 'ansible_search_path' from source: unknown 15627 1726882495.60730: calling self._execute() 15627 1726882495.60805: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.60809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.60820: variable 'omit' from source: magic vars 15627 1726882495.61084: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.61094: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882495.61176: variable 'network_provider' from source: set_fact 15627 1726882495.61180: Evaluated conditional (network_provider == "nm"): True 15627 1726882495.61244: variable '__network_wpa_supplicant_required' from source: role '' defaults 15627 1726882495.61307: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15627 1726882495.61420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882495.62895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882495.62940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882495.62969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882495.62997: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882495.63018: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882495.63087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.63108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.63126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.63152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.63164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.63197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.63214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.63231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.63258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.63269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.63297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.63312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.63331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.63355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.63369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.63468: variable 'network_connections' from source: play vars 15627 1726882495.63476: variable 'profile' from source: play vars 15627 1726882495.63521: variable 'profile' from source: play vars 15627 1726882495.63524: variable 'interface' from source: set_fact 15627 1726882495.63570: variable 'interface' from source: set_fact 15627 1726882495.63618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15627 1726882495.63729: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15627 1726882495.63756: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15627 1726882495.63782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15627 1726882495.63803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15627 1726882495.63833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15627 1726882495.63848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15627 1726882495.63872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.63889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15627 1726882495.63925: variable '__network_wireless_connections_defined' from source: role '' defaults 15627 1726882495.64088: variable 'network_connections' from source: play vars 15627 1726882495.64091: variable 'profile' from source: play vars 15627 1726882495.64134: variable 'profile' from source: play vars 15627 1726882495.64137: variable 'interface' from source: set_fact 15627 1726882495.64184: variable 'interface' from source: set_fact 15627 1726882495.64206: Evaluated conditional (__network_wpa_supplicant_required): False 15627 1726882495.64209: when evaluation is False, skipping this task 15627 1726882495.64211: _execute() done 15627 1726882495.64220: dumping result 
to json 15627 1726882495.64224: done dumping result, returning 15627 1726882495.64226: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-2847-7723-000000000067] 15627 1726882495.64228: sending task result for task 0e448fcc-3ce9-2847-7723-000000000067 15627 1726882495.64311: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000067 15627 1726882495.64314: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15627 1726882495.64353: no more pending results, returning what we have 15627 1726882495.64356: results queue empty 15627 1726882495.64357: checking for any_errors_fatal 15627 1726882495.64382: done checking for any_errors_fatal 15627 1726882495.64383: checking for max_fail_percentage 15627 1726882495.64385: done checking for max_fail_percentage 15627 1726882495.64386: checking to see if all hosts have failed and the running result is not ok 15627 1726882495.64387: done checking to see if all hosts have failed 15627 1726882495.64387: getting the remaining hosts for this loop 15627 1726882495.64389: done getting the remaining hosts for this loop 15627 1726882495.64397: getting the next task for host managed_node1 15627 1726882495.64403: done getting next task for host managed_node1 15627 1726882495.64406: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15627 1726882495.64408: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882495.64420: getting variables 15627 1726882495.64421: in VariableManager get_vars() 15627 1726882495.64451: Calling all_inventory to load vars for managed_node1 15627 1726882495.64454: Calling groups_inventory to load vars for managed_node1 15627 1726882495.64456: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882495.64466: Calling all_plugins_play to load vars for managed_node1 15627 1726882495.64469: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882495.64472: Calling groups_plugins_play to load vars for managed_node1 15627 1726882495.65295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882495.66824: done with get_vars() 15627 1726882495.66838: done getting variables 15627 1726882495.66893: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:55 -0400 (0:00:00.067) 0:00:35.420 ****** 15627 1726882495.66915: entering _queue_task() for managed_node1/service 15627 1726882495.67103: worker is 1 (out of 1 available) 15627 1726882495.67115: exiting _queue_task() for managed_node1/service 15627 1726882495.67127: done queuing things up, now waiting for results queue to drain 15627 1726882495.67128: waiting for pending results... 
15627 1726882495.67310: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15627 1726882495.67384: in run() - task 0e448fcc-3ce9-2847-7723-000000000068 15627 1726882495.67396: variable 'ansible_search_path' from source: unknown 15627 1726882495.67399: variable 'ansible_search_path' from source: unknown 15627 1726882495.67426: calling self._execute() 15627 1726882495.67506: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.67510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.67519: variable 'omit' from source: magic vars 15627 1726882495.67783: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.67792: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882495.67869: variable 'network_provider' from source: set_fact 15627 1726882495.67873: Evaluated conditional (network_provider == "initscripts"): False 15627 1726882495.67877: when evaluation is False, skipping this task 15627 1726882495.67880: _execute() done 15627 1726882495.67882: dumping result to json 15627 1726882495.67885: done dumping result, returning 15627 1726882495.67891: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-2847-7723-000000000068] 15627 1726882495.67900: sending task result for task 0e448fcc-3ce9-2847-7723-000000000068 15627 1726882495.67979: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000068 15627 1726882495.67982: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15627 1726882495.68044: no more pending results, returning what we have 15627 1726882495.68048: results queue empty 15627 1726882495.68049: checking for any_errors_fatal 15627 1726882495.68054: done checking for 
any_errors_fatal 15627 1726882495.68055: checking for max_fail_percentage 15627 1726882495.68056: done checking for max_fail_percentage 15627 1726882495.68057: checking to see if all hosts have failed and the running result is not ok 15627 1726882495.68058: done checking to see if all hosts have failed 15627 1726882495.68059: getting the remaining hosts for this loop 15627 1726882495.68060: done getting the remaining hosts for this loop 15627 1726882495.68062: getting the next task for host managed_node1 15627 1726882495.68070: done getting next task for host managed_node1 15627 1726882495.68074: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15627 1726882495.68076: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882495.68089: getting variables 15627 1726882495.68091: in VariableManager get_vars() 15627 1726882495.68124: Calling all_inventory to load vars for managed_node1 15627 1726882495.68126: Calling groups_inventory to load vars for managed_node1 15627 1726882495.68128: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882495.68136: Calling all_plugins_play to load vars for managed_node1 15627 1726882495.68138: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882495.68141: Calling groups_plugins_play to load vars for managed_node1 15627 1726882495.69147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882495.70473: done with get_vars() 15627 1726882495.70487: done getting variables 15627 1726882495.70523: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:55 -0400 (0:00:00.036) 0:00:35.457 ****** 15627 1726882495.70544: entering _queue_task() for managed_node1/copy 15627 1726882495.70713: worker is 1 (out of 1 available) 15627 1726882495.70726: exiting _queue_task() for managed_node1/copy 15627 1726882495.70738: done queuing things up, now waiting for results queue to drain 15627 1726882495.70739: waiting for pending results... 
15627 1726882495.70895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15627 1726882495.70961: in run() - task 0e448fcc-3ce9-2847-7723-000000000069 15627 1726882495.70977: variable 'ansible_search_path' from source: unknown 15627 1726882495.70981: variable 'ansible_search_path' from source: unknown 15627 1726882495.71009: calling self._execute() 15627 1726882495.71082: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.71090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.71098: variable 'omit' from source: magic vars 15627 1726882495.71352: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.71365: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882495.71439: variable 'network_provider' from source: set_fact 15627 1726882495.71443: Evaluated conditional (network_provider == "initscripts"): False 15627 1726882495.71446: when evaluation is False, skipping this task 15627 1726882495.71449: _execute() done 15627 1726882495.71452: dumping result to json 15627 1726882495.71459: done dumping result, returning 15627 1726882495.71468: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-2847-7723-000000000069] 15627 1726882495.71473: sending task result for task 0e448fcc-3ce9-2847-7723-000000000069 15627 1726882495.71553: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000069 15627 1726882495.71556: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15627 1726882495.71601: no more pending results, returning what we have 15627 1726882495.71605: results queue empty 15627 1726882495.71606: checking for 
any_errors_fatal 15627 1726882495.71609: done checking for any_errors_fatal 15627 1726882495.71610: checking for max_fail_percentage 15627 1726882495.71612: done checking for max_fail_percentage 15627 1726882495.71613: checking to see if all hosts have failed and the running result is not ok 15627 1726882495.71614: done checking to see if all hosts have failed 15627 1726882495.71614: getting the remaining hosts for this loop 15627 1726882495.71615: done getting the remaining hosts for this loop 15627 1726882495.71618: getting the next task for host managed_node1 15627 1726882495.71623: done getting next task for host managed_node1 15627 1726882495.71626: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15627 1726882495.71628: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882495.71639: getting variables 15627 1726882495.71640: in VariableManager get_vars() 15627 1726882495.71671: Calling all_inventory to load vars for managed_node1 15627 1726882495.71673: Calling groups_inventory to load vars for managed_node1 15627 1726882495.71675: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882495.71683: Calling all_plugins_play to load vars for managed_node1 15627 1726882495.71685: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882495.71688: Calling groups_plugins_play to load vars for managed_node1 15627 1726882495.72430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882495.73443: done with get_vars() 15627 1726882495.73458: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:55 -0400 (0:00:00.029) 0:00:35.486 ****** 15627 1726882495.73512: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15627 1726882495.73716: worker is 1 (out of 1 available) 15627 1726882495.73729: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15627 1726882495.73741: done queuing things up, now waiting for results queue to drain 15627 1726882495.73742: waiting for pending results... 
15627 1726882495.74012: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15627 1726882495.74119: in run() - task 0e448fcc-3ce9-2847-7723-00000000006a 15627 1726882495.74141: variable 'ansible_search_path' from source: unknown 15627 1726882495.74149: variable 'ansible_search_path' from source: unknown 15627 1726882495.74197: calling self._execute() 15627 1726882495.74299: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.74311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.74328: variable 'omit' from source: magic vars 15627 1726882495.74713: variable 'ansible_distribution_major_version' from source: facts 15627 1726882495.74736: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882495.74747: variable 'omit' from source: magic vars 15627 1726882495.74787: variable 'omit' from source: magic vars 15627 1726882495.74957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15627 1726882495.77374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15627 1726882495.77443: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15627 1726882495.77486: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15627 1726882495.77526: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15627 1726882495.77562: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15627 1726882495.77643: variable 'network_provider' from source: set_fact 15627 1726882495.77761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15627 1726882495.77804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15627 1726882495.77831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15627 1726882495.77878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15627 1726882495.77895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15627 1726882495.77965: variable 'omit' from source: magic vars 15627 1726882495.78069: variable 'omit' from source: magic vars 15627 1726882495.78168: variable 'network_connections' from source: play vars 15627 1726882495.78239: variable 'profile' from source: play vars 15627 1726882495.78303: variable 'profile' from source: play vars 15627 1726882495.78311: variable 'interface' from source: set_fact 15627 1726882495.78368: variable 'interface' from source: set_fact 15627 1726882495.78500: variable 'omit' from source: magic vars 15627 1726882495.78516: variable '__lsr_ansible_managed' from source: task vars 15627 1726882495.78581: variable '__lsr_ansible_managed' from source: task vars 15627 1726882495.78766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15627 1726882495.78999: Loaded config def from plugin (lookup/template) 15627 1726882495.79009: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15627 1726882495.79043: File lookup term: get_ansible_managed.j2 15627 1726882495.79050: variable 'ansible_search_path' from source: unknown 15627 1726882495.79063: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15627 1726882495.79082: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15627 1726882495.79102: variable 'ansible_search_path' from source: unknown 15627 1726882495.85609: variable 'ansible_managed' from source: unknown 15627 1726882495.85752: variable 'omit' from source: magic vars 15627 1726882495.85790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882495.85821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882495.85846: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882495.85876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882495.85895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882495.85928: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882495.85937: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.85946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.86052: Set connection var ansible_timeout to 10 15627 1726882495.86067: Set connection var ansible_shell_executable to /bin/sh 15627 1726882495.86076: Set connection var ansible_connection to ssh 15627 1726882495.86085: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882495.86095: Set connection var ansible_pipelining to False 15627 1726882495.86103: Set connection var ansible_shell_type to sh 15627 1726882495.86129: variable 'ansible_shell_executable' from source: unknown 15627 1726882495.86136: variable 'ansible_connection' from source: unknown 15627 1726882495.86143: variable 'ansible_module_compression' from source: unknown 15627 1726882495.86149: variable 'ansible_shell_type' from source: unknown 15627 1726882495.86155: variable 'ansible_shell_executable' from source: unknown 15627 1726882495.86161: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882495.86171: variable 'ansible_pipelining' from source: unknown 15627 1726882495.86177: variable 'ansible_timeout' from source: unknown 15627 1726882495.86184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882495.86324: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882495.86349: variable 'omit' from source: magic vars 15627 1726882495.86359: starting attempt loop 15627 1726882495.86368: running the handler 15627 1726882495.86386: _low_level_execute_command(): starting 15627 1726882495.86399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882495.87130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882495.87145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.87160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.87181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.87227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.87240: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882495.87253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.87273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882495.87285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882495.87300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882495.87314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.87329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.87345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.87358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.87373: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882495.87388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.87469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.87495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882495.87515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.87649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.89320: stdout chunk (state=3): >>>/root <<< 15627 1726882495.89504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.89507: stdout chunk (state=3): >>><<< 15627 1726882495.89509: stderr chunk (state=3): >>><<< 15627 1726882495.89614: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.89619: _low_level_execute_command(): starting 15627 1726882495.89623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091 `" && echo ansible-tmp-1726882495.895291-17140-108548001978091="` echo /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091 `" ) && sleep 0' 15627 1726882495.90204: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882495.90218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.90234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.90253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.90303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.90316: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882495.90331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.90349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882495.90362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882495.90379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882495.90395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.90411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.90428: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.90441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.90454: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882495.90472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.90552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.90576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882495.90599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.90731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.92633: stdout chunk (state=3): >>>ansible-tmp-1726882495.895291-17140-108548001978091=/root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091 <<< 15627 1726882495.92747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882495.92838: stderr chunk (state=3): >>><<< 15627 1726882495.92850: stdout chunk (state=3): >>><<< 15627 1726882495.93079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882495.895291-17140-108548001978091=/root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882495.93086: variable 'ansible_module_compression' from source: unknown 15627 1726882495.93089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15627 1726882495.93091: variable 'ansible_facts' from source: unknown 15627 1726882495.93122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/AnsiballZ_network_connections.py 15627 1726882495.93343: Sending initial data 15627 1726882495.93396: Sent initial data (167 bytes) 15627 1726882495.96186: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882495.96202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.96218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.96239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.96283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.96295: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882495.96331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.96352: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 15627 1726882495.96367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882495.96380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882495.96392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882495.96406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882495.96424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882495.96437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882495.96452: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882495.96468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882495.96548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882495.96576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882495.96594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882495.96723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882495.98552: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882495.98647: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882495.98739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp_qb3vq69 /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/AnsiballZ_network_connections.py <<< 15627 1726882495.98834: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882496.01082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.01292: stderr chunk (state=3): >>><<< 15627 1726882496.01295: stdout chunk (state=3): >>><<< 15627 1726882496.01297: done transferring module to remote 15627 1726882496.01300: _low_level_execute_command(): starting 15627 1726882496.01302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/ /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/AnsiballZ_network_connections.py && sleep 0' 15627 1726882496.02125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.02128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.02171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.02175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.02177: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.02179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.02241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.02884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.03195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.04946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.05023: stderr chunk (state=3): >>><<< 15627 1726882496.05027: stdout chunk (state=3): >>><<< 15627 1726882496.05076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882496.05080: _low_level_execute_command(): starting 15627 1726882496.05082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/AnsiballZ_network_connections.py && sleep 0' 15627 1726882496.06709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.06713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.06729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.06882: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.06893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.06909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.06917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.06923: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.06932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.06941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.06953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.06967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.06974: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.06989: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.07067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.07102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.07215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.07434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.31095: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e7eyxvts/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e7eyxvts/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/8673f01a-a0f2-4871-9987-eca35b758d19: error=unknown <<< 15627 1726882496.31272: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15627 1726882496.32787: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882496.32790: stdout chunk (state=3): >>><<< 15627 1726882496.32803: stderr chunk (state=3): >>><<< 15627 1726882496.32819: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e7eyxvts/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_e7eyxvts/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/8673f01a-a0f2-4871-9987-eca35b758d19: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882496.32861: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882496.32872: _low_level_execute_command(): starting 15627 1726882496.32880: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882495.895291-17140-108548001978091/ > /dev/null 2>&1 && sleep 0' 15627 1726882496.33529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882496.33539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.33549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.33567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.33611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.33619: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.33632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.33646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.33654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.33665: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.33674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.33685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.33695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.33702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.33709: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.33718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.33809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 
1726882496.33813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.33820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.33939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.35770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.35816: stderr chunk (state=3): >>><<< 15627 1726882496.35820: stdout chunk (state=3): >>><<< 15627 1726882496.35836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882496.35842: handler run complete 15627 1726882496.35878: attempt loop complete, returning result 15627 1726882496.35883: _execute() done 15627 1726882496.35885: dumping result to json 15627 1726882496.35888: done dumping result, returning 15627 1726882496.35908: 
done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-2847-7723-00000000006a] 15627 1726882496.35911: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006a 15627 1726882496.36009: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006a 15627 1726882496.36011: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15627 1726882496.36097: no more pending results, returning what we have 15627 1726882496.36100: results queue empty 15627 1726882496.36101: checking for any_errors_fatal 15627 1726882496.36106: done checking for any_errors_fatal 15627 1726882496.36107: checking for max_fail_percentage 15627 1726882496.36109: done checking for max_fail_percentage 15627 1726882496.36109: checking to see if all hosts have failed and the running result is not ok 15627 1726882496.36111: done checking to see if all hosts have failed 15627 1726882496.36111: getting the remaining hosts for this loop 15627 1726882496.36113: done getting the remaining hosts for this loop 15627 1726882496.36117: getting the next task for host managed_node1 15627 1726882496.36124: done getting next task for host managed_node1 15627 1726882496.36128: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882496.36130: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882496.36138: getting variables 15627 1726882496.36140: in VariableManager get_vars() 15627 1726882496.36181: Calling all_inventory to load vars for managed_node1 15627 1726882496.36184: Calling groups_inventory to load vars for managed_node1 15627 1726882496.36186: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882496.36195: Calling all_plugins_play to load vars for managed_node1 15627 1726882496.36198: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882496.36200: Calling groups_plugins_play to load vars for managed_node1 15627 1726882496.37795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882496.41599: done with get_vars() 15627 1726882496.41630: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:56 -0400 (0:00:00.683) 0:00:36.170 ****** 15627 1726882496.41899: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882496.42760: worker is 1 (out of 1 available) 15627 1726882496.42775: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15627 1726882496.42788: done queuing things up, now waiting for results queue to drain 15627 1726882496.42789: waiting for pending results... 
15627 1726882496.43229: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15627 1726882496.43335: in run() - task 0e448fcc-3ce9-2847-7723-00000000006b 15627 1726882496.43358: variable 'ansible_search_path' from source: unknown 15627 1726882496.43371: variable 'ansible_search_path' from source: unknown 15627 1726882496.43413: calling self._execute() 15627 1726882496.43515: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.43527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.43543: variable 'omit' from source: magic vars 15627 1726882496.44031: variable 'ansible_distribution_major_version' from source: facts 15627 1726882496.44050: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882496.44173: variable 'network_state' from source: role '' defaults 15627 1726882496.44191: Evaluated conditional (network_state != {}): False 15627 1726882496.44198: when evaluation is False, skipping this task 15627 1726882496.44205: _execute() done 15627 1726882496.44211: dumping result to json 15627 1726882496.44218: done dumping result, returning 15627 1726882496.44234: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-2847-7723-00000000006b] 15627 1726882496.44246: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15627 1726882496.44391: no more pending results, returning what we have 15627 1726882496.44395: results queue empty 15627 1726882496.44396: checking for any_errors_fatal 15627 1726882496.44409: done checking for any_errors_fatal 15627 1726882496.44410: checking for max_fail_percentage 15627 1726882496.44411: done checking for max_fail_percentage 15627 1726882496.44412: 
checking to see if all hosts have failed and the running result is not ok 15627 1726882496.44414: done checking to see if all hosts have failed 15627 1726882496.44414: getting the remaining hosts for this loop 15627 1726882496.44416: done getting the remaining hosts for this loop 15627 1726882496.44420: getting the next task for host managed_node1 15627 1726882496.44428: done getting next task for host managed_node1 15627 1726882496.44432: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882496.44434: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882496.44448: getting variables 15627 1726882496.44450: in VariableManager get_vars() 15627 1726882496.44491: Calling all_inventory to load vars for managed_node1 15627 1726882496.44495: Calling groups_inventory to load vars for managed_node1 15627 1726882496.44497: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882496.44511: Calling all_plugins_play to load vars for managed_node1 15627 1726882496.44514: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882496.44517: Calling groups_plugins_play to load vars for managed_node1 15627 1726882496.46102: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006b 15627 1726882496.46107: WORKER PROCESS EXITING 15627 1726882496.47137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882496.49349: done with get_vars() 15627 1726882496.49449: done getting variables 15627 1726882496.49656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:56 -0400 (0:00:00.077) 0:00:36.248 ****** 15627 1726882496.49690: entering _queue_task() for managed_node1/debug 15627 1726882496.50077: worker is 1 (out of 1 available) 15627 1726882496.50090: exiting _queue_task() for managed_node1/debug 15627 1726882496.50104: done queuing things up, now waiting for results queue to drain 15627 1726882496.50105: waiting for pending results... 15627 1726882496.50407: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15627 1726882496.50518: in run() - task 0e448fcc-3ce9-2847-7723-00000000006c 15627 1726882496.50538: variable 'ansible_search_path' from source: unknown 15627 1726882496.50551: variable 'ansible_search_path' from source: unknown 15627 1726882496.50596: calling self._execute() 15627 1726882496.50713: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.50743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.50761: variable 'omit' from source: magic vars 15627 1726882496.51357: variable 'ansible_distribution_major_version' from source: facts 15627 1726882496.51378: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882496.51389: variable 'omit' from source: magic vars 15627 1726882496.51435: variable 'omit' from source: magic vars 15627 1726882496.51522: variable 'omit' from source: magic vars 15627 1726882496.51595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882496.51650: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882496.51683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882496.51776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.51846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.51916: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882496.51926: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.51934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.52090: Set connection var ansible_timeout to 10 15627 1726882496.52105: Set connection var ansible_shell_executable to /bin/sh 15627 1726882496.52117: Set connection var ansible_connection to ssh 15627 1726882496.52150: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882496.52160: Set connection var ansible_pipelining to False 15627 1726882496.52169: Set connection var ansible_shell_type to sh 15627 1726882496.52194: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.52202: variable 'ansible_connection' from source: unknown 15627 1726882496.52268: variable 'ansible_module_compression' from source: unknown 15627 1726882496.52276: variable 'ansible_shell_type' from source: unknown 15627 1726882496.52283: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.52289: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.52298: variable 'ansible_pipelining' from source: unknown 15627 1726882496.52323: variable 'ansible_timeout' from source: unknown 15627 1726882496.52333: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15627 1726882496.52651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882496.52676: variable 'omit' from source: magic vars 15627 1726882496.52711: starting attempt loop 15627 1726882496.52719: running the handler 15627 1726882496.52961: variable '__network_connections_result' from source: set_fact 15627 1726882496.53021: handler run complete 15627 1726882496.53044: attempt loop complete, returning result 15627 1726882496.53078: _execute() done 15627 1726882496.53091: dumping result to json 15627 1726882496.53105: done dumping result, returning 15627 1726882496.53124: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-2847-7723-00000000006c] 15627 1726882496.53127: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006c 15627 1726882496.53234: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006c 15627 1726882496.53238: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15627 1726882496.53429: no more pending results, returning what we have 15627 1726882496.53433: results queue empty 15627 1726882496.53434: checking for any_errors_fatal 15627 1726882496.53439: done checking for any_errors_fatal 15627 1726882496.53440: checking for max_fail_percentage 15627 1726882496.53441: done checking for max_fail_percentage 15627 1726882496.53442: checking to see if all hosts have failed and the running result is not ok 15627 1726882496.53443: done checking to see if all hosts have failed 15627 1726882496.53466: getting the remaining hosts for this loop 15627 1726882496.53468: done getting the remaining hosts for this loop 
15627 1726882496.53472: getting the next task for host managed_node1 15627 1726882496.53493: done getting next task for host managed_node1 15627 1726882496.53497: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882496.53500: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882496.53537: getting variables 15627 1726882496.53539: in VariableManager get_vars() 15627 1726882496.53571: Calling all_inventory to load vars for managed_node1 15627 1726882496.53573: Calling groups_inventory to load vars for managed_node1 15627 1726882496.53574: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882496.53584: Calling all_plugins_play to load vars for managed_node1 15627 1726882496.53587: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882496.53591: Calling groups_plugins_play to load vars for managed_node1 15627 1726882496.55361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882496.57313: done with get_vars() 15627 1726882496.57334: done getting variables 15627 1726882496.57395: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:56 -0400 (0:00:00.077) 0:00:36.325 
****** 15627 1726882496.57426: entering _queue_task() for managed_node1/debug 15627 1726882496.57703: worker is 1 (out of 1 available) 15627 1726882496.57728: exiting _queue_task() for managed_node1/debug 15627 1726882496.57740: done queuing things up, now waiting for results queue to drain 15627 1726882496.57742: waiting for pending results... 15627 1726882496.57931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15627 1726882496.58045: in run() - task 0e448fcc-3ce9-2847-7723-00000000006d 15627 1726882496.58085: variable 'ansible_search_path' from source: unknown 15627 1726882496.58089: variable 'ansible_search_path' from source: unknown 15627 1726882496.58123: calling self._execute() 15627 1726882496.58223: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.58227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.58237: variable 'omit' from source: magic vars 15627 1726882496.58708: variable 'ansible_distribution_major_version' from source: facts 15627 1726882496.58728: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882496.58746: variable 'omit' from source: magic vars 15627 1726882496.58800: variable 'omit' from source: magic vars 15627 1726882496.58838: variable 'omit' from source: magic vars 15627 1726882496.58889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882496.58928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882496.58962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882496.58998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.59016: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.59049: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882496.59062: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.59084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.59208: Set connection var ansible_timeout to 10 15627 1726882496.59222: Set connection var ansible_shell_executable to /bin/sh 15627 1726882496.59235: Set connection var ansible_connection to ssh 15627 1726882496.59250: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882496.59274: Set connection var ansible_pipelining to False 15627 1726882496.59289: Set connection var ansible_shell_type to sh 15627 1726882496.59351: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.59376: variable 'ansible_connection' from source: unknown 15627 1726882496.59391: variable 'ansible_module_compression' from source: unknown 15627 1726882496.59399: variable 'ansible_shell_type' from source: unknown 15627 1726882496.59407: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.59418: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.59426: variable 'ansible_pipelining' from source: unknown 15627 1726882496.59432: variable 'ansible_timeout' from source: unknown 15627 1726882496.59439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.59609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882496.59643: variable 'omit' from source: magic vars 15627 1726882496.59656: starting attempt 
loop 15627 1726882496.59667: running the handler 15627 1726882496.59732: variable '__network_connections_result' from source: set_fact 15627 1726882496.59826: variable '__network_connections_result' from source: set_fact 15627 1726882496.59982: handler run complete 15627 1726882496.60012: attempt loop complete, returning result 15627 1726882496.60019: _execute() done 15627 1726882496.60025: dumping result to json 15627 1726882496.60033: done dumping result, returning 15627 1726882496.60045: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-2847-7723-00000000006d] 15627 1726882496.60057: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006d 15627 1726882496.60171: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006d ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15627 1726882496.60256: no more pending results, returning what we have 15627 1726882496.60258: results queue empty 15627 1726882496.60260: checking for any_errors_fatal 15627 1726882496.60267: done checking for any_errors_fatal 15627 1726882496.60267: checking for max_fail_percentage 15627 1726882496.60269: done checking for max_fail_percentage 15627 1726882496.60270: checking to see if all hosts have failed and the running result is not ok 15627 1726882496.60271: done checking to see if all hosts have failed 15627 1726882496.60272: getting the remaining hosts for this loop 15627 1726882496.60273: done getting the remaining hosts for this loop 15627 1726882496.60278: getting the next task for host managed_node1 15627 
1726882496.60286: done getting next task for host managed_node1 15627 1726882496.60289: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882496.60291: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882496.60324: getting variables 15627 1726882496.60326: in VariableManager get_vars() 15627 1726882496.60361: Calling all_inventory to load vars for managed_node1 15627 1726882496.60365: Calling groups_inventory to load vars for managed_node1 15627 1726882496.60367: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882496.60379: Calling all_plugins_play to load vars for managed_node1 15627 1726882496.60382: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882496.60386: Calling groups_plugins_play to load vars for managed_node1 15627 1726882496.60932: WORKER PROCESS EXITING 15627 1726882496.62183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882496.64512: done with get_vars() 15627 1726882496.64560: done getting variables 15627 1726882496.64631: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:56 -0400 (0:00:00.072) 0:00:36.398 ****** 15627 1726882496.64697: entering 
_queue_task() for managed_node1/debug 15627 1726882496.65067: worker is 1 (out of 1 available) 15627 1726882496.65081: exiting _queue_task() for managed_node1/debug 15627 1726882496.65098: done queuing things up, now waiting for results queue to drain 15627 1726882496.65099: waiting for pending results... 15627 1726882496.65443: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15627 1726882496.65575: in run() - task 0e448fcc-3ce9-2847-7723-00000000006e 15627 1726882496.65595: variable 'ansible_search_path' from source: unknown 15627 1726882496.65602: variable 'ansible_search_path' from source: unknown 15627 1726882496.65645: calling self._execute() 15627 1726882496.65758: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.65775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.65793: variable 'omit' from source: magic vars 15627 1726882496.66192: variable 'ansible_distribution_major_version' from source: facts 15627 1726882496.66212: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882496.66353: variable 'network_state' from source: role '' defaults 15627 1726882496.66376: Evaluated conditional (network_state != {}): False 15627 1726882496.66384: when evaluation is False, skipping this task 15627 1726882496.66392: _execute() done 15627 1726882496.66401: dumping result to json 15627 1726882496.66412: done dumping result, returning 15627 1726882496.66424: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-2847-7723-00000000006e] 15627 1726882496.66439: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006e skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15627 1726882496.66592: no more pending results, returning what we have 15627 1726882496.66596: results 
queue empty 15627 1726882496.66597: checking for any_errors_fatal 15627 1726882496.66608: done checking for any_errors_fatal 15627 1726882496.66609: checking for max_fail_percentage 15627 1726882496.66611: done checking for max_fail_percentage 15627 1726882496.66612: checking to see if all hosts have failed and the running result is not ok 15627 1726882496.66613: done checking to see if all hosts have failed 15627 1726882496.66614: getting the remaining hosts for this loop 15627 1726882496.66616: done getting the remaining hosts for this loop 15627 1726882496.66620: getting the next task for host managed_node1 15627 1726882496.66627: done getting next task for host managed_node1 15627 1726882496.66632: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882496.66634: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882496.66649: getting variables 15627 1726882496.66651: in VariableManager get_vars() 15627 1726882496.66694: Calling all_inventory to load vars for managed_node1 15627 1726882496.66697: Calling groups_inventory to load vars for managed_node1 15627 1726882496.66700: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882496.66712: Calling all_plugins_play to load vars for managed_node1 15627 1726882496.66716: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882496.66719: Calling groups_plugins_play to load vars for managed_node1 15627 1726882496.67684: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006e 15627 1726882496.67687: WORKER PROCESS EXITING 15627 1726882496.68681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882496.70492: done with get_vars() 15627 1726882496.70514: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:56 -0400 (0:00:00.059) 0:00:36.457 ****** 15627 1726882496.70612: entering _queue_task() for managed_node1/ping 15627 1726882496.70907: worker is 1 (out of 1 available) 15627 1726882496.70919: exiting _queue_task() for managed_node1/ping 15627 1726882496.70931: done queuing things up, now waiting for results queue to drain 15627 1726882496.70933: waiting for pending results... 
15627 1726882496.71228: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15627 1726882496.71341: in run() - task 0e448fcc-3ce9-2847-7723-00000000006f 15627 1726882496.71371: variable 'ansible_search_path' from source: unknown 15627 1726882496.71383: variable 'ansible_search_path' from source: unknown 15627 1726882496.71421: calling self._execute() 15627 1726882496.71526: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.71537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.71551: variable 'omit' from source: magic vars 15627 1726882496.71959: variable 'ansible_distribution_major_version' from source: facts 15627 1726882496.71979: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882496.71991: variable 'omit' from source: magic vars 15627 1726882496.72041: variable 'omit' from source: magic vars 15627 1726882496.72084: variable 'omit' from source: magic vars 15627 1726882496.72133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882496.72181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882496.72206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882496.72233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.72258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882496.72294: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882496.72303: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.72311: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15627 1726882496.72423: Set connection var ansible_timeout to 10 15627 1726882496.72442: Set connection var ansible_shell_executable to /bin/sh 15627 1726882496.72451: Set connection var ansible_connection to ssh 15627 1726882496.72464: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882496.72478: Set connection var ansible_pipelining to False 15627 1726882496.72485: Set connection var ansible_shell_type to sh 15627 1726882496.72513: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.72521: variable 'ansible_connection' from source: unknown 15627 1726882496.72529: variable 'ansible_module_compression' from source: unknown 15627 1726882496.72536: variable 'ansible_shell_type' from source: unknown 15627 1726882496.72548: variable 'ansible_shell_executable' from source: unknown 15627 1726882496.72559: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882496.72571: variable 'ansible_pipelining' from source: unknown 15627 1726882496.72582: variable 'ansible_timeout' from source: unknown 15627 1726882496.72591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882496.72812: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882496.72828: variable 'omit' from source: magic vars 15627 1726882496.72837: starting attempt loop 15627 1726882496.72843: running the handler 15627 1726882496.72864: _low_level_execute_command(): starting 15627 1726882496.72880: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882496.73696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882496.73712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882496.73729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.73751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.73805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.73818: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.73832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.73851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.73873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.73889: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.73903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.73916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.73931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.73943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.73958: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.73976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.74059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.74078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.74096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.74231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882496.75904: stdout chunk (state=3): >>>/root <<< 15627 1726882496.76068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.76075: stdout chunk (state=3): >>><<< 15627 1726882496.76083: stderr chunk (state=3): >>><<< 15627 1726882496.76103: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882496.76117: _low_level_execute_command(): starting 15627 1726882496.76124: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758 `" && echo ansible-tmp-1726882496.7610283-17185-2478483227758="` echo /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758 `" ) && sleep 0' 15627 1726882496.76713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 15627 1726882496.76721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.76731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.76744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.76783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.76790: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.76801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.76814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.76821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.76828: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.76836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.76845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.76859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.76862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.76874: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.76882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.76953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.76969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.76972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
15627 1726882496.77112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.78959: stdout chunk (state=3): >>>ansible-tmp-1726882496.7610283-17185-2478483227758=/root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758 <<< 15627 1726882496.79076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.79138: stderr chunk (state=3): >>><<< 15627 1726882496.79141: stdout chunk (state=3): >>><<< 15627 1726882496.79159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882496.7610283-17185-2478483227758=/root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882496.79201: variable 'ansible_module_compression' from source: unknown 15627 1726882496.79238: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15627 1726882496.79271: variable 'ansible_facts' from source: unknown 15627 1726882496.79343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/AnsiballZ_ping.py 15627 1726882496.79468: Sending initial data 15627 1726882496.79472: Sent initial data (151 bytes) 15627 1726882496.80348: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882496.80359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.80371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.80383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.80418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.80423: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.80433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.80445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.80452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.80458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.80467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.80481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.80492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.80499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882496.80505: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.80515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.80592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.80610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.80613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.80743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.82449: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882496.82542: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882496.82641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpohzy36eq /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/AnsiballZ_ping.py <<< 15627 1726882496.82733: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882496.83980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.84111: stderr chunk (state=3): >>><<< 15627 1726882496.84114: stdout chunk (state=3): >>><<< 15627 1726882496.84132: done transferring module 
to remote 15627 1726882496.84142: _low_level_execute_command(): starting 15627 1726882496.84148: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/ /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/AnsiballZ_ping.py && sleep 0' 15627 1726882496.84759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882496.84767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.84778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.84792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.84830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.84841: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.84851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.84866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.84875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.84881: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.84889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.84898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.84909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.84917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.84924: stderr chunk (state=3): >>>debug2: match found <<< 15627 
1726882496.84933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.85009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.85026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.85038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.85161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882496.86913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882496.86916: stdout chunk (state=3): >>><<< 15627 1726882496.86923: stderr chunk (state=3): >>><<< 15627 1726882496.86936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882496.86939: 
_low_level_execute_command(): starting 15627 1726882496.86944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/AnsiballZ_ping.py && sleep 0' 15627 1726882496.87486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882496.87494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.87504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.87516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.87551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.87559: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882496.87570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.87581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882496.87589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882496.87595: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882496.87604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882496.87612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882496.87622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882496.87633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882496.87636: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882496.87644: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882496.87728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882496.87731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882496.87738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882496.87879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882497.00629: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15627 1726882497.01588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882497.01774: stderr chunk (state=3): >>><<< 15627 1726882497.01777: stdout chunk (state=3): >>><<< 15627 1726882497.01780: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882497.01783: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882497.01789: _low_level_execute_command(): starting 15627 1726882497.01792: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882496.7610283-17185-2478483227758/ > /dev/null 2>&1 && sleep 0' 15627 1726882497.02574: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882497.02580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.02582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.02585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.02587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.02589: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.02593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.02596: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 15627 1726882497.02598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.02599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.02601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.02603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.02605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.02607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.02609: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882497.02611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.02613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.02630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.02641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882497.02944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882497.04798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882497.04802: stdout chunk (state=3): >>><<< 15627 1726882497.04808: stderr chunk (state=3): >>><<< 15627 1726882497.04829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882497.04832: handler run complete 15627 1726882497.04849: attempt loop complete, returning result 15627 1726882497.04851: _execute() done 15627 1726882497.04856: dumping result to json 15627 1726882497.04859: done dumping result, returning 15627 1726882497.04867: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-2847-7723-00000000006f] 15627 1726882497.04872: sending task result for task 0e448fcc-3ce9-2847-7723-00000000006f 15627 1726882497.04968: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000006f 15627 1726882497.04971: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15627 1726882497.05025: no more pending results, returning what we have 15627 1726882497.05028: results queue empty 15627 1726882497.05029: checking for any_errors_fatal 15627 1726882497.05036: done checking for any_errors_fatal 15627 1726882497.05037: checking for max_fail_percentage 15627 1726882497.05039: done checking for max_fail_percentage 15627 1726882497.05040: checking to see if all hosts have failed and the running result is not ok 15627 1726882497.05041: done checking to see if all hosts have 
failed 15627 1726882497.05042: getting the remaining hosts for this loop 15627 1726882497.05043: done getting the remaining hosts for this loop 15627 1726882497.05047: getting the next task for host managed_node1 15627 1726882497.05058: done getting next task for host managed_node1 15627 1726882497.05061: ^ task is: TASK: meta (role_complete) 15627 1726882497.05062: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882497.05074: getting variables 15627 1726882497.05075: in VariableManager get_vars() 15627 1726882497.05112: Calling all_inventory to load vars for managed_node1 15627 1726882497.05115: Calling groups_inventory to load vars for managed_node1 15627 1726882497.05117: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.05126: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.05129: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.05131: Calling groups_plugins_play to load vars for managed_node1 15627 1726882497.06827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.09870: done with get_vars() 15627 1726882497.09893: done getting variables 15627 1726882497.09987: done queuing things up, now waiting for results queue to drain 15627 1726882497.09990: results queue empty 15627 1726882497.09990: checking for any_errors_fatal 15627 1726882497.09993: done checking for any_errors_fatal 15627 1726882497.09994: checking for max_fail_percentage 15627 1726882497.09995: done checking for max_fail_percentage 15627 1726882497.09996: checking to see if all hosts have failed and the running result is not ok 15627 1726882497.09997: done checking to see if all hosts 
have failed 15627 1726882497.09998: getting the remaining hosts for this loop 15627 1726882497.09999: done getting the remaining hosts for this loop 15627 1726882497.10001: getting the next task for host managed_node1 15627 1726882497.10005: done getting next task for host managed_node1 15627 1726882497.10006: ^ task is: TASK: meta (flush_handlers) 15627 1726882497.10008: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882497.10011: getting variables 15627 1726882497.10012: in VariableManager get_vars() 15627 1726882497.10028: Calling all_inventory to load vars for managed_node1 15627 1726882497.10030: Calling groups_inventory to load vars for managed_node1 15627 1726882497.10032: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.10041: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.10044: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.10047: Calling groups_plugins_play to load vars for managed_node1 15627 1726882497.12147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.16056: done with get_vars() 15627 1726882497.16086: done getting variables 15627 1726882497.16145: in VariableManager get_vars() 15627 1726882497.16159: Calling all_inventory to load vars for managed_node1 15627 1726882497.16162: Calling groups_inventory to load vars for managed_node1 15627 1726882497.16166: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.16171: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.16173: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.16176: Calling groups_plugins_play 
to load vars for managed_node1 15627 1726882497.17616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.28447: done with get_vars() 15627 1726882497.28482: done queuing things up, now waiting for results queue to drain 15627 1726882497.28485: results queue empty 15627 1726882497.28486: checking for any_errors_fatal 15627 1726882497.28487: done checking for any_errors_fatal 15627 1726882497.28488: checking for max_fail_percentage 15627 1726882497.28489: done checking for max_fail_percentage 15627 1726882497.28490: checking to see if all hosts have failed and the running result is not ok 15627 1726882497.28491: done checking to see if all hosts have failed 15627 1726882497.28492: getting the remaining hosts for this loop 15627 1726882497.28492: done getting the remaining hosts for this loop 15627 1726882497.28495: getting the next task for host managed_node1 15627 1726882497.28499: done getting next task for host managed_node1 15627 1726882497.28501: ^ task is: TASK: meta (flush_handlers) 15627 1726882497.28502: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882497.28505: getting variables 15627 1726882497.28506: in VariableManager get_vars() 15627 1726882497.28517: Calling all_inventory to load vars for managed_node1 15627 1726882497.28520: Calling groups_inventory to load vars for managed_node1 15627 1726882497.28522: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.28527: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.28530: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.28533: Calling groups_plugins_play to load vars for managed_node1 15627 1726882497.30612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.32878: done with get_vars() 15627 1726882497.32903: done getting variables 15627 1726882497.33099: in VariableManager get_vars() 15627 1726882497.33111: Calling all_inventory to load vars for managed_node1 15627 1726882497.33116: Calling groups_inventory to load vars for managed_node1 15627 1726882497.33118: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.33139: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.33142: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.33146: Calling groups_plugins_play to load vars for managed_node1 15627 1726882497.35449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.37693: done with get_vars() 15627 1726882497.37724: done queuing things up, now waiting for results queue to drain 15627 1726882497.37726: results queue empty 15627 1726882497.37727: checking for any_errors_fatal 15627 1726882497.37728: done checking for any_errors_fatal 15627 1726882497.37729: checking for max_fail_percentage 15627 1726882497.37730: done checking for max_fail_percentage 15627 1726882497.37731: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882497.37732: done checking to see if all hosts have failed 15627 1726882497.37733: getting the remaining hosts for this loop 15627 1726882497.37734: done getting the remaining hosts for this loop 15627 1726882497.37737: getting the next task for host managed_node1 15627 1726882497.37740: done getting next task for host managed_node1 15627 1726882497.37741: ^ task is: None 15627 1726882497.37742: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882497.37744: done queuing things up, now waiting for results queue to drain 15627 1726882497.37745: results queue empty 15627 1726882497.37745: checking for any_errors_fatal 15627 1726882497.37746: done checking for any_errors_fatal 15627 1726882497.37747: checking for max_fail_percentage 15627 1726882497.37748: done checking for max_fail_percentage 15627 1726882497.37748: checking to see if all hosts have failed and the running result is not ok 15627 1726882497.37749: done checking to see if all hosts have failed 15627 1726882497.37750: getting the next task for host managed_node1 15627 1726882497.37752: done getting next task for host managed_node1 15627 1726882497.37756: ^ task is: None 15627 1726882497.37757: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882497.37796: in VariableManager get_vars() 15627 1726882497.37812: done with get_vars() 15627 1726882497.37818: in VariableManager get_vars() 15627 1726882497.37828: done with get_vars() 15627 1726882497.37832: variable 'omit' from source: magic vars 15627 1726882497.37942: variable 'task' from source: play vars 15627 1726882497.37975: in VariableManager get_vars() 15627 1726882497.37987: done with get_vars() 15627 1726882497.38010: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15627 1726882497.38244: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882497.38273: getting the remaining hosts for this loop 15627 1726882497.38274: done getting the remaining hosts for this loop 15627 1726882497.38276: getting the next task for host managed_node1 15627 1726882497.38279: done getting next task for host managed_node1 15627 1726882497.38281: ^ task is: TASK: Gathering Facts 15627 1726882497.38282: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882497.38284: getting variables 15627 1726882497.38285: in VariableManager get_vars() 15627 1726882497.38293: Calling all_inventory to load vars for managed_node1 15627 1726882497.38296: Calling groups_inventory to load vars for managed_node1 15627 1726882497.38298: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882497.38303: Calling all_plugins_play to load vars for managed_node1 15627 1726882497.38305: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882497.38308: Calling groups_plugins_play to load vars for managed_node1 15627 1726882497.39679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882497.41693: done with get_vars() 15627 1726882497.41712: done getting variables 15627 1726882497.41765: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:34:57 -0400 (0:00:00.711) 0:00:37.169 ****** 15627 1726882497.41788: entering _queue_task() for managed_node1/gather_facts 15627 1726882497.42121: worker is 1 (out of 1 available) 15627 1726882497.42132: exiting _queue_task() for managed_node1/gather_facts 15627 1726882497.42144: done queuing things up, now waiting for results queue to drain 15627 1726882497.42146: waiting for pending results... 
15627 1726882497.42445: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882497.42566: in run() - task 0e448fcc-3ce9-2847-7723-00000000046e 15627 1726882497.42594: variable 'ansible_search_path' from source: unknown 15627 1726882497.42638: calling self._execute() 15627 1726882497.42741: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882497.42752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882497.42772: variable 'omit' from source: magic vars 15627 1726882497.43186: variable 'ansible_distribution_major_version' from source: facts 15627 1726882497.43203: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882497.43214: variable 'omit' from source: magic vars 15627 1726882497.43248: variable 'omit' from source: magic vars 15627 1726882497.43296: variable 'omit' from source: magic vars 15627 1726882497.43338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882497.43390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882497.43415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882497.43436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882497.43459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882497.43502: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882497.43512: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882497.43520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882497.43631: Set connection var ansible_timeout to 10 15627 1726882497.43645: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882497.43657: Set connection var ansible_connection to ssh 15627 1726882497.43670: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882497.43684: Set connection var ansible_pipelining to False 15627 1726882497.43690: Set connection var ansible_shell_type to sh 15627 1726882497.43721: variable 'ansible_shell_executable' from source: unknown 15627 1726882497.43729: variable 'ansible_connection' from source: unknown 15627 1726882497.43735: variable 'ansible_module_compression' from source: unknown 15627 1726882497.43741: variable 'ansible_shell_type' from source: unknown 15627 1726882497.43747: variable 'ansible_shell_executable' from source: unknown 15627 1726882497.43753: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882497.43765: variable 'ansible_pipelining' from source: unknown 15627 1726882497.43772: variable 'ansible_timeout' from source: unknown 15627 1726882497.43779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882497.43970: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882497.43990: variable 'omit' from source: magic vars 15627 1726882497.44005: starting attempt loop 15627 1726882497.44012: running the handler 15627 1726882497.44037: variable 'ansible_facts' from source: unknown 15627 1726882497.44065: _low_level_execute_command(): starting 15627 1726882497.44078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882497.44870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882497.44890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882497.44905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.44924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.44971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.44988: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.45008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.45027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882497.45037: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.45047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.45062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.45079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.45097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.45114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.45125: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882497.45139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.45226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.45243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.45260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882497.45395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882497.47069: stdout chunk (state=3): >>>/root <<< 15627 1726882497.47188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882497.47262: stderr chunk (state=3): >>><<< 15627 1726882497.47267: stdout chunk (state=3): >>><<< 15627 1726882497.47371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882497.47376: _low_level_execute_command(): starting 15627 1726882497.47379: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039 `" && echo ansible-tmp-1726882497.472875-17207-277811064115039="` echo /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039 `" ) && sleep 0' 15627 1726882497.47985: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882497.47999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.48015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.48034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.48085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.48099: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.48113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.48131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882497.48143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.48162: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.48179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.48193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.48210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.48223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.48235: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882497.48250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.48332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.48356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.48378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882497.48511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882497.50361: stdout chunk (state=3): >>>ansible-tmp-1726882497.472875-17207-277811064115039=/root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039 <<< 15627 1726882497.50542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882497.50545: stdout chunk (state=3): >>><<< 15627 1726882497.50547: stderr chunk (state=3): >>><<< 15627 1726882497.50869: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882497.472875-17207-277811064115039=/root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882497.50873: variable 'ansible_module_compression' from source: unknown 15627 1726882497.50876: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882497.50878: variable 'ansible_facts' from source: unknown 15627 1726882497.50880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/AnsiballZ_setup.py 15627 1726882497.51038: Sending initial data 15627 1726882497.51041: Sent initial data (153 bytes) 15627 1726882497.52010: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882497.52023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.52036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.52051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.52101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.52112: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.52124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.52140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882497.52150: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.52165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.52178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.52196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.52211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.52221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882497.52234: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882497.52247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.52332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.52351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.52371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882497.52493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882497.54203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882497.54301: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882497.54415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpn11yc2e2 /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/AnsiballZ_setup.py <<< 15627 1726882497.54506: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882497.57231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882497.57350: stderr chunk (state=3): >>><<< 15627 1726882497.57357: stdout chunk (state=3): >>><<< 15627 1726882497.57359: done transferring module 
to remote 15627 1726882497.57362: _low_level_execute_command(): starting 15627 1726882497.57371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/ /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/AnsiballZ_setup.py && sleep 0' 15627 1726882497.57943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882497.57960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.57977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.57994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.58036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.58047: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.58065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.58083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882497.58095: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.58105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.58116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.58128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.58142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.58153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.58168: stderr chunk (state=3): >>>debug2: match found <<< 15627 
1726882497.58184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.58259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.58280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.58294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882497.58480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882497.60137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882497.60209: stderr chunk (state=3): >>><<< 15627 1726882497.60218: stdout chunk (state=3): >>><<< 15627 1726882497.60319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882497.60322: 
_low_level_execute_command(): starting 15627 1726882497.60326: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/AnsiballZ_setup.py && sleep 0' 15627 1726882497.60918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882497.60932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.60947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.60969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.61017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.61029: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882497.61044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.61061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882497.61077: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882497.61089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882497.61104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882497.61119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882497.61135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882497.61148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882497.61159: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882497.61177: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882497.61255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882497.61280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882497.61297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882497.61429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.12551: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ<<< 15627 1726882498.12604: stdout chunk (state=3): >>>/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "57", "epoch": "1726882497", "epoch_int": "1726882497", "date": "2024-09-20", "time": "21:34:57", "iso8601_micro": "2024-09-21T01:34:57.865694Z", "iso8601": "2024-09-21T01:34:57Z", "iso8601_basic": "20240920T213457865694", "iso8601_basic_short": "20240920T213457", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "Ge<<< 15627 1726882498.12645: stdout chunk (state=3): >>>nuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2806, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 726, "free": 2806}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": 
"NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 655, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": 
"ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882498.14287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.44.90 closed. <<< 15627 1726882498.14366: stderr chunk (state=3): >>><<< 15627 1726882498.14369: stdout chunk (state=3): >>><<< 15627 1726882498.14428: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", 
"has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "57", "epoch": "1726882497", 
"epoch_int": "1726882497", "date": "2024-09-20", "time": "21:34:57", "iso8601_micro": "2024-09-21T01:34:57.865694Z", "iso8601": "2024-09-21T01:34:57Z", "iso8601_basic": "20240920T213457865694", "iso8601_basic_short": "20240920T213457", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2806, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 726, "free": 2806}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": 
{"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 655, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882498.14862: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882498.14890: _low_level_execute_command(): starting 15627 1726882498.14902: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882497.472875-17207-277811064115039/ > /dev/null 2>&1 && sleep 0' 15627 1726882498.15608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882498.15623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.15640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.15666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.15713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.15726: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882498.15739: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.15758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882498.15773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882498.15785: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882498.15801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.15815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.15829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.15840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.15849: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882498.15868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.15946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882498.15974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882498.15990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882498.16169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.17925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882498.17989: stderr chunk (state=3): >>><<< 15627 1726882498.17996: stdout chunk (state=3): >>><<< 15627 1726882498.18062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882498.18068: handler run complete 15627 1726882498.18169: variable 'ansible_facts' from source: unknown 15627 1726882498.18300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882498.18789: variable 'ansible_facts' from source: unknown 15627 1726882498.18885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882498.19027: attempt loop complete, returning result 15627 1726882498.19042: _execute() done 15627 1726882498.19050: dumping result to json 15627 1726882498.19092: done dumping result, returning 15627 1726882498.19105: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-00000000046e] 15627 1726882498.19113: sending task result for task 0e448fcc-3ce9-2847-7723-00000000046e ok: [managed_node1] 15627 1726882498.19934: no more pending results, returning what we have 15627 1726882498.19937: results queue empty 15627 1726882498.19938: checking for 
any_errors_fatal
15627 1726882498.19940: done checking for any_errors_fatal
15627 1726882498.19940: checking for max_fail_percentage
15627 1726882498.19942: done checking for max_fail_percentage
15627 1726882498.19943: checking to see if all hosts have failed and the running result is not ok
15627 1726882498.19944: done checking to see if all hosts have failed
15627 1726882498.19945: getting the remaining hosts for this loop
15627 1726882498.19946: done getting the remaining hosts for this loop
15627 1726882498.19950: getting the next task for host managed_node1
15627 1726882498.19957: done getting next task for host managed_node1
15627 1726882498.19959: ^ task is: TASK: meta (flush_handlers)
15627 1726882498.19961: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882498.19967: getting variables
15627 1726882498.19968: in VariableManager get_vars()
15627 1726882498.20000: Calling all_inventory to load vars for managed_node1
15627 1726882498.20003: Calling groups_inventory to load vars for managed_node1
15627 1726882498.20007: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.20019: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.20023: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.20026: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.20987: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000046e
15627 1726882498.20993: WORKER PROCESS EXITING
15627 1726882498.21743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.22923: done with get_vars()
15627 1726882498.22951: done getting variables
15627 1726882498.23056: in VariableManager get_vars()
15627 1726882498.23073: Calling all_inventory to load vars for managed_node1
15627 1726882498.23079: Calling groups_inventory to load vars for managed_node1
15627 1726882498.23082: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.23088: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.23091: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.23104: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.24414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.26329: done with get_vars()
15627 1726882498.26359: done queuing things up, now waiting for results queue to drain
15627 1726882498.26362: results queue empty
15627 1726882498.26362: checking for any_errors_fatal
15627 1726882498.26369: done checking for any_errors_fatal
15627 1726882498.26371: checking for max_fail_percentage
15627 1726882498.26374: done checking for max_fail_percentage
15627 1726882498.26374: checking to see if all hosts have failed and the running result is not ok
15627 1726882498.26375: done checking to see if all hosts have failed
15627 1726882498.26376: getting the remaining hosts for this loop
15627 1726882498.26377: done getting the remaining hosts for this loop
15627 1726882498.26380: getting the next task for host managed_node1
15627 1726882498.26384: done getting next task for host managed_node1
15627 1726882498.26387: ^ task is: TASK: Include the task '{{ task }}'
15627 1726882498.26389: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882498.26391: getting variables
15627 1726882498.26392: in VariableManager get_vars()
15627 1726882498.26404: Calling all_inventory to load vars for managed_node1
15627 1726882498.26407: Calling groups_inventory to load vars for managed_node1
15627 1726882498.26409: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.26414: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.26416: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.26419: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.27327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.28508: done with get_vars()
15627 1726882498.28528: done getting variables
15627 1726882498.28685: variable 'task' from source: play vars

TASK [Include the task 'tasks/assert_profile_absent.yml'] **********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6
Friday 20 September 2024 21:34:58 -0400 (0:00:00.869) 0:00:38.038 ******
15627 1726882498.28709: entering _queue_task() for managed_node1/include_tasks
15627 1726882498.28993: worker is 1 (out of 1 available)
15627 1726882498.29006: exiting _queue_task() for managed_node1/include_tasks
15627 1726882498.29017: done queuing things up, now waiting for results queue to drain
15627 1726882498.29018: waiting for pending results...
15627 1726882498.29208: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml'
15627 1726882498.29297: in run() - task 0e448fcc-3ce9-2847-7723-000000000073
15627 1726882498.29320: variable 'ansible_search_path' from source: unknown
15627 1726882498.29359: calling self._execute()
15627 1726882498.29447: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.29468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.29486: variable 'omit' from source: magic vars
15627 1726882498.29875: variable 'ansible_distribution_major_version' from source: facts
15627 1726882498.29897: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882498.29913: variable 'task' from source: play vars
15627 1726882498.29995: variable 'task' from source: play vars
15627 1726882498.30015: _execute() done
15627 1726882498.30025: dumping result to json
15627 1726882498.30039: done dumping result, returning
15627 1726882498.30055: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' [0e448fcc-3ce9-2847-7723-000000000073]
15627 1726882498.30070: sending task result for task 0e448fcc-3ce9-2847-7723-000000000073
15627 1726882498.30195: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000073
15627 1726882498.30209: WORKER PROCESS EXITING
15627 1726882498.30287: no more pending results, returning what we have
15627 1726882498.30292: in VariableManager get_vars()
15627 1726882498.30491: Calling all_inventory to load vars for managed_node1
15627 1726882498.30493: Calling groups_inventory to load vars for managed_node1
15627 1726882498.30497: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.30505: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.30507: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.30509: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.31838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.32931: done with get_vars()
15627 1726882498.32943: variable 'ansible_search_path' from source: unknown
15627 1726882498.32953: we have included files to process
15627 1726882498.32954: generating all_blocks data
15627 1726882498.32955: done generating all_blocks data
15627 1726882498.32956: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
15627 1726882498.32957: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
15627 1726882498.32959: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml
15627 1726882498.33070: in VariableManager get_vars()
15627 1726882498.33080: done with get_vars()
15627 1726882498.33152: done processing included file
15627 1726882498.33154: iterating over new_blocks loaded from include file
15627 1726882498.33155: in VariableManager get_vars()
15627 1726882498.33163: done with get_vars()
15627 1726882498.33166: filtering new block on tags
15627 1726882498.33178: done filtering new block on tags
15627 1726882498.33180: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1
15627 1726882498.33183: extending task lists for all hosts with included blocks
15627 1726882498.33201: done extending task lists
15627 1726882498.33202: done processing included files
15627 1726882498.33202: results queue empty
15627 1726882498.33203: checking for any_errors_fatal
15627 1726882498.33203: done checking for any_errors_fatal
15627 1726882498.33204: checking for max_fail_percentage
15627 1726882498.33205: done checking for max_fail_percentage
15627 1726882498.33205: checking to see if all hosts have failed and the running result is not ok
15627 1726882498.33206: done checking to see if all hosts have failed
15627 1726882498.33206: getting the remaining hosts for this loop
15627 1726882498.33207: done getting the remaining hosts for this loop
15627 1726882498.33208: getting the next task for host managed_node1
15627 1726882498.33211: done getting next task for host managed_node1
15627 1726882498.33212: ^ task is: TASK: Include the task 'get_profile_stat.yml'
15627 1726882498.33214: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882498.33215: getting variables
15627 1726882498.33216: in VariableManager get_vars()
15627 1726882498.33221: Calling all_inventory to load vars for managed_node1
15627 1726882498.33222: Calling groups_inventory to load vars for managed_node1
15627 1726882498.33224: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.33227: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.33228: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.33230: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.34154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.35947: done with get_vars()
15627 1726882498.35981: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3
Friday 20 September 2024 21:34:58 -0400 (0:00:00.073) 0:00:38.112 ******
15627 1726882498.36076: entering _queue_task() for managed_node1/include_tasks
15627 1726882498.36461: worker is 1 (out of 1 available)
15627 1726882498.36479: exiting _queue_task() for managed_node1/include_tasks
15627 1726882498.36493: done queuing things up, now waiting for results queue to drain
15627 1726882498.36494: waiting for pending results...
15627 1726882498.36770: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml'
15627 1726882498.36907: in run() - task 0e448fcc-3ce9-2847-7723-00000000047f
15627 1726882498.36930: variable 'ansible_search_path' from source: unknown
15627 1726882498.36939: variable 'ansible_search_path' from source: unknown
15627 1726882498.36982: calling self._execute()
15627 1726882498.37082: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.37093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.37111: variable 'omit' from source: magic vars
15627 1726882498.37499: variable 'ansible_distribution_major_version' from source: facts
15627 1726882498.37522: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882498.37541: _execute() done
15627 1726882498.37550: dumping result to json
15627 1726882498.37560: done dumping result, returning
15627 1726882498.37578: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-2847-7723-00000000047f]
15627 1726882498.37589: sending task result for task 0e448fcc-3ce9-2847-7723-00000000047f
15627 1726882498.37707: no more pending results, returning what we have
15627 1726882498.37715: in VariableManager get_vars()
15627 1726882498.37748: Calling all_inventory to load vars for managed_node1
15627 1726882498.37751: Calling groups_inventory to load vars for managed_node1
15627 1726882498.37755: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.37770: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.37773: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.37776: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.39098: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000047f
15627 1726882498.39105: WORKER PROCESS EXITING
15627 1726882498.39599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.41320: done with get_vars()
15627 1726882498.41342: variable 'ansible_search_path' from source: unknown
15627 1726882498.41343: variable 'ansible_search_path' from source: unknown
15627 1726882498.41352: variable 'task' from source: play vars
15627 1726882498.41456: variable 'task' from source: play vars
15627 1726882498.41484: we have included files to process
15627 1726882498.41485: generating all_blocks data
15627 1726882498.41486: done generating all_blocks data
15627 1726882498.41487: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
15627 1726882498.41488: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
15627 1726882498.41489: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
15627 1726882498.42185: done processing included file
15627 1726882498.42187: iterating over new_blocks loaded from include file
15627 1726882498.42188: in VariableManager get_vars()
15627 1726882498.42197: done with get_vars()
15627 1726882498.42198: filtering new block on tags
15627 1726882498.42213: done filtering new block on tags
15627 1726882498.42215: in VariableManager get_vars()
15627 1726882498.42221: done with get_vars()
15627 1726882498.42222: filtering new block on tags
15627 1726882498.42234: done filtering new block on tags
15627 1726882498.42235: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1
15627 1726882498.42238: extending task lists for all hosts with included blocks
15627 1726882498.42341: done extending task lists
15627 1726882498.42342: done processing included files
15627 1726882498.42343: results queue empty
15627 1726882498.42344: checking for any_errors_fatal
15627 1726882498.42348: done checking for any_errors_fatal
15627 1726882498.42349: checking for max_fail_percentage
15627 1726882498.42350: done checking for max_fail_percentage
15627 1726882498.42351: checking to see if all hosts have failed and the running result is not ok
15627 1726882498.42351: done checking to see if all hosts have failed
15627 1726882498.42352: getting the remaining hosts for this loop
15627 1726882498.42353: done getting the remaining hosts for this loop
15627 1726882498.42356: getting the next task for host managed_node1
15627 1726882498.42360: done getting next task for host managed_node1
15627 1726882498.42361: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
15627 1726882498.42365: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882498.42367: getting variables
15627 1726882498.42368: in VariableManager get_vars()
15627 1726882498.42380: Calling all_inventory to load vars for managed_node1
15627 1726882498.42383: Calling groups_inventory to load vars for managed_node1
15627 1726882498.42385: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.42391: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.42393: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.42396: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.43435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.45147: done with get_vars()
15627 1726882498.45183: done getting variables
15627 1726882498.45245: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 21:34:58 -0400 (0:00:00.092) 0:00:38.204 ******
15627 1726882498.45285: entering _queue_task() for managed_node1/set_fact
15627 1726882498.45639: worker is 1 (out of 1 available)
15627 1726882498.45656: exiting _queue_task() for managed_node1/set_fact
15627 1726882498.45671: done queuing things up, now waiting for results queue to drain
15627 1726882498.45673: waiting for pending results...
15627 1726882498.45989: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag
15627 1726882498.46161: in run() - task 0e448fcc-3ce9-2847-7723-00000000048a
15627 1726882498.46188: variable 'ansible_search_path' from source: unknown
15627 1726882498.46200: variable 'ansible_search_path' from source: unknown
15627 1726882498.46255: calling self._execute()
15627 1726882498.46375: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.46387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.46402: variable 'omit' from source: magic vars
15627 1726882498.46819: variable 'ansible_distribution_major_version' from source: facts
15627 1726882498.46838: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882498.46860: variable 'omit' from source: magic vars
15627 1726882498.46928: variable 'omit' from source: magic vars
15627 1726882498.46980: variable 'omit' from source: magic vars
15627 1726882498.47041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15627 1726882498.47092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15627 1726882498.47125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15627 1726882498.47148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882498.47171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882498.47229: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15627 1726882498.47242: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.47251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.47384: Set connection var ansible_timeout to 10
15627 1726882498.47406: Set connection var ansible_shell_executable to /bin/sh
15627 1726882498.47417: Set connection var ansible_connection to ssh
15627 1726882498.47435: Set connection var ansible_module_compression to ZIP_DEFLATED
15627 1726882498.47446: Set connection var ansible_pipelining to False
15627 1726882498.47456: Set connection var ansible_shell_type to sh
15627 1726882498.47497: variable 'ansible_shell_executable' from source: unknown
15627 1726882498.47508: variable 'ansible_connection' from source: unknown
15627 1726882498.47517: variable 'ansible_module_compression' from source: unknown
15627 1726882498.47524: variable 'ansible_shell_type' from source: unknown
15627 1726882498.47532: variable 'ansible_shell_executable' from source: unknown
15627 1726882498.47544: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.47561: variable 'ansible_pipelining' from source: unknown
15627 1726882498.47580: variable 'ansible_timeout' from source: unknown
15627 1726882498.47590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.47750: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15627 1726882498.47773: variable 'omit' from source: magic vars
15627 1726882498.47784: starting attempt loop
15627 1726882498.47792: running the handler
15627 1726882498.47810: handler run complete
15627 1726882498.47833: attempt loop complete, returning result
15627 1726882498.47842: _execute() done
15627 1726882498.47848: dumping result to json
15627 1726882498.47858: done dumping result, returning
15627 1726882498.47877: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-2847-7723-00000000048a]
15627 1726882498.47888: sending task result for task 0e448fcc-3ce9-2847-7723-00000000048a
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
15627 1726882498.48061: no more pending results, returning what we have
15627 1726882498.48073: results queue empty
15627 1726882498.48076: checking for any_errors_fatal
15627 1726882498.48078: done checking for any_errors_fatal
15627 1726882498.48079: checking for max_fail_percentage
15627 1726882498.48081: done checking for max_fail_percentage
15627 1726882498.48082: checking to see if all hosts have failed and the running result is not ok
15627 1726882498.48083: done checking to see if all hosts have failed
15627 1726882498.48084: getting the remaining hosts for this loop
15627 1726882498.48085: done getting the remaining hosts for this loop
15627 1726882498.48090: getting the next task for host managed_node1
15627 1726882498.48100: done getting next task for host managed_node1
15627 1726882498.48102: ^ task is: TASK: Stat profile file
15627 1726882498.48107: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882498.48114: getting variables
15627 1726882498.48116: in VariableManager get_vars()
15627 1726882498.48156: Calling all_inventory to load vars for managed_node1
15627 1726882498.48159: Calling groups_inventory to load vars for managed_node1
15627 1726882498.48167: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882498.48183: Calling all_plugins_play to load vars for managed_node1
15627 1726882498.48188: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882498.48191: Calling groups_plugins_play to load vars for managed_node1
15627 1726882498.49182: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000048a
15627 1726882498.49186: WORKER PROCESS EXITING
15627 1726882498.50149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882498.51691: done with get_vars()
15627 1726882498.51709: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 21:34:58 -0400 (0:00:00.065) 0:00:38.269 ******
15627 1726882498.51783: entering _queue_task() for managed_node1/stat
15627 1726882498.52110: worker is 1 (out of 1 available)
15627 1726882498.52127: exiting _queue_task() for managed_node1/stat
15627 1726882498.52144: done queuing things up, now waiting for results queue to drain
15627 1726882498.52146: waiting for pending results...
15627 1726882498.52589: running TaskExecutor() for managed_node1/TASK: Stat profile file
15627 1726882498.52684: in run() - task 0e448fcc-3ce9-2847-7723-00000000048b
15627 1726882498.52697: variable 'ansible_search_path' from source: unknown
15627 1726882498.52702: variable 'ansible_search_path' from source: unknown
15627 1726882498.52793: calling self._execute()
15627 1726882498.52910: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.52914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.52923: variable 'omit' from source: magic vars
15627 1726882498.54399: variable 'ansible_distribution_major_version' from source: facts
15627 1726882498.54599: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882498.54611: variable 'omit' from source: magic vars
15627 1726882498.54670: variable 'omit' from source: magic vars
15627 1726882498.54779: variable 'profile' from source: play vars
15627 1726882498.54800: variable 'interface' from source: set_fact
15627 1726882498.54870: variable 'interface' from source: set_fact
15627 1726882498.54894: variable 'omit' from source: magic vars
15627 1726882498.54958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15627 1726882498.55044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15627 1726882498.55088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15627 1726882498.55098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882498.55111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882498.55153: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15627 1726882498.55159: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.55161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.55247: Set connection var ansible_timeout to 10
15627 1726882498.55260: Set connection var ansible_shell_executable to /bin/sh
15627 1726882498.55267: Set connection var ansible_connection to ssh
15627 1726882498.55272: Set connection var ansible_module_compression to ZIP_DEFLATED
15627 1726882498.55277: Set connection var ansible_pipelining to False
15627 1726882498.55280: Set connection var ansible_shell_type to sh
15627 1726882498.55300: variable 'ansible_shell_executable' from source: unknown
15627 1726882498.55303: variable 'ansible_connection' from source: unknown
15627 1726882498.55306: variable 'ansible_module_compression' from source: unknown
15627 1726882498.55308: variable 'ansible_shell_type' from source: unknown
15627 1726882498.55311: variable 'ansible_shell_executable' from source: unknown
15627 1726882498.55313: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882498.55316: variable 'ansible_pipelining' from source: unknown
15627 1726882498.55318: variable 'ansible_timeout' from source: unknown
15627 1726882498.55323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882498.55485: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
15627 1726882498.55494: variable 'omit' from source: magic vars
15627 1726882498.55500: starting attempt loop
15627 1726882498.55503: running the handler
15627 1726882498.55514: _low_level_execute_command(): starting
15627 1726882498.55520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15627 1726882498.56015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15627 1726882498.56031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882498.56045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882498.56065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882498.56104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15627 1726882498.56128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15627 1726882498.56221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15627 1726882498.57902: stdout chunk (state=3): >>>/root <<<
15627 1726882498.58585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15627 1726882498.58661: stderr chunk (state=3): >>><<<
15627 1726882498.58667: stdout chunk (state=3): >>><<<
15627 1726882498.58784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15627 1726882498.58788: _low_level_execute_command(): starting
15627 1726882498.58790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055 `" && echo ansible-tmp-1726882498.586875-17250-31862204866055="` echo /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055 `" ) && sleep 0'
15627 1726882498.60041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882498.60045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882498.60076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<<
15627 1726882498.60089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
15627 1726882498.60286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882498.60289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15627 1726882498.60366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
15627 1726882498.60373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15627 1726882498.60384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15627 1726882498.60690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15627 1726882498.62569: stdout chunk (state=3): >>>ansible-tmp-1726882498.586875-17250-31862204866055=/root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055 <<<
15627 1726882498.62673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15627 1726882498.62818: stderr chunk (state=3): >>><<<
15627 1726882498.62821: stdout chunk (state=3): >>><<<
15627 1726882498.63075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882498.586875-17250-31862204866055=/root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15627 1726882498.63079: variable 'ansible_module_compression' from source: unknown
15627 1726882498.63082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
15627 1726882498.63084: variable 'ansible_facts' from source: unknown
15627 1726882498.63101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/AnsiballZ_stat.py
15627 1726882498.63791: Sending initial data
15627 1726882498.63794: Sent initial data (151 bytes)
15627 1726882498.66321: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15627 1726882498.66477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15627 1726882498.66494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15627 1726882498.66514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15627 1726882498.66561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
15627 1726882498.66584: stderr chunk (state=3): >>>debug2: match not found <<<
15627
1726882498.66600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.66619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882498.66632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882498.66644: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882498.66657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.66682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.66704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.66718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.66730: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882498.66751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.66845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882498.66922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882498.66939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882498.67128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.68846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882498.68933: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882498.69027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpac02_e74 /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/AnsiballZ_stat.py <<< 15627 1726882498.69116: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882498.70736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882498.70859: stderr chunk (state=3): >>><<< 15627 1726882498.70862: stdout chunk (state=3): >>><<< 15627 1726882498.70866: done transferring module to remote 15627 1726882498.70868: _low_level_execute_command(): starting 15627 1726882498.70870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/ /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/AnsiballZ_stat.py && sleep 0' 15627 1726882498.72308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882498.72350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.72397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.72415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.72457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.72502: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882498.72518: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.72539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882498.72594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882498.72613: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882498.72625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.72638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.72653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.72671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.72684: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882498.72698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.72783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882498.72896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882498.72913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882498.73066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.74860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882498.74865: stdout chunk (state=3): >>><<< 15627 1726882498.74868: stderr chunk (state=3): >>><<< 15627 1726882498.74960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882498.74966: _low_level_execute_command(): starting 15627 1726882498.74969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/AnsiballZ_stat.py && sleep 0' 15627 1726882498.75680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882498.75695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.75715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.75733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.75779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.75790: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882498.75803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.75826: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 15627 1726882498.75839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882498.75849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882498.75860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.75877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.75891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.75907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.75917: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882498.75930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.76014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882498.76036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882498.76050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882498.76187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.89193: stdout chunk (state=3): >>> <<< 15627 1726882498.89199: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15627 1726882498.90211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882498.90277: stderr chunk (state=3): >>><<< 15627 1726882498.90292: stdout chunk (state=3): >>><<< 15627 1726882498.90336: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882498.90462: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882498.90473: _low_level_execute_command(): starting 15627 1726882498.90475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882498.586875-17250-31862204866055/ > /dev/null 2>&1 && sleep 0' 15627 1726882498.92141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882498.92159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.92201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.92221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.92268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.92306: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882498.92320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.92336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 
1726882498.92347: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882498.92357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882498.92372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882498.92385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882498.92401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882498.92414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882498.92424: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882498.92440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882498.92524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882498.92546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882498.92560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882498.92688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882498.94505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882498.94695: stderr chunk (state=3): >>><<< 15627 1726882498.94732: stdout chunk (state=3): >>><<< 15627 1726882498.94770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882498.94810: handler run complete 15627 1726882498.94814: attempt loop complete, returning result 15627 1726882498.94830: _execute() done 15627 1726882498.94890: dumping result to json 15627 1726882498.95078: done dumping result, returning 15627 1726882498.95081: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-2847-7723-00000000048b] 15627 1726882498.95083: sending task result for task 0e448fcc-3ce9-2847-7723-00000000048b 15627 1726882498.95169: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000048b 15627 1726882498.95173: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15627 1726882498.95236: no more pending results, returning what we have 15627 1726882498.95240: results queue empty 15627 1726882498.95241: checking for any_errors_fatal 15627 1726882498.95249: done checking for any_errors_fatal 15627 1726882498.95250: checking for max_fail_percentage 15627 1726882498.95252: done checking for max_fail_percentage 15627 1726882498.95253: checking to see if all hosts have failed and the running result is not ok 15627 1726882498.95254: done checking to see if all hosts have failed 15627 1726882498.95255: getting the remaining hosts for this loop 15627 
1726882498.95257: done getting the remaining hosts for this loop 15627 1726882498.95261: getting the next task for host managed_node1 15627 1726882498.95271: done getting next task for host managed_node1 15627 1726882498.95274: ^ task is: TASK: Set NM profile exist flag based on the profile files 15627 1726882498.95279: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882498.95282: getting variables 15627 1726882498.95284: in VariableManager get_vars() 15627 1726882498.95384: Calling all_inventory to load vars for managed_node1 15627 1726882498.95387: Calling groups_inventory to load vars for managed_node1 15627 1726882498.95391: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882498.95448: Calling all_plugins_play to load vars for managed_node1 15627 1726882498.95453: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882498.95457: Calling groups_plugins_play to load vars for managed_node1 15627 1726882498.99708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882499.01631: done with get_vars() 15627 1726882499.01658: done getting variables 15627 1726882499.01754: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:34:59 -0400 (0:00:00.500) 0:00:38.769 ****** 15627 1726882499.01832: entering _queue_task() for managed_node1/set_fact 15627 1726882499.02182: worker is 1 (out of 1 available) 15627 1726882499.02196: exiting _queue_task() for managed_node1/set_fact 15627 1726882499.02208: done queuing things up, now waiting for results queue to drain 15627 1726882499.02209: waiting for pending results... 
15627 1726882499.02501: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15627 1726882499.02637: in run() - task 0e448fcc-3ce9-2847-7723-00000000048c 15627 1726882499.02670: variable 'ansible_search_path' from source: unknown 15627 1726882499.02681: variable 'ansible_search_path' from source: unknown 15627 1726882499.02720: calling self._execute() 15627 1726882499.02815: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882499.02825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882499.02839: variable 'omit' from source: magic vars 15627 1726882499.03236: variable 'ansible_distribution_major_version' from source: facts 15627 1726882499.03252: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882499.03397: variable 'profile_stat' from source: set_fact 15627 1726882499.03420: Evaluated conditional (profile_stat.stat.exists): False 15627 1726882499.03432: when evaluation is False, skipping this task 15627 1726882499.03442: _execute() done 15627 1726882499.03448: dumping result to json 15627 1726882499.03455: done dumping result, returning 15627 1726882499.03466: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-2847-7723-00000000048c] 15627 1726882499.03477: sending task result for task 0e448fcc-3ce9-2847-7723-00000000048c skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15627 1726882499.03658: no more pending results, returning what we have 15627 1726882499.03662: results queue empty 15627 1726882499.03665: checking for any_errors_fatal 15627 1726882499.03676: done checking for any_errors_fatal 15627 1726882499.03676: checking for max_fail_percentage 15627 1726882499.03678: done checking for max_fail_percentage 15627 1726882499.03679: checking to see if all 
hosts have failed and the running result is not ok 15627 1726882499.03680: done checking to see if all hosts have failed 15627 1726882499.03681: getting the remaining hosts for this loop 15627 1726882499.03683: done getting the remaining hosts for this loop 15627 1726882499.03687: getting the next task for host managed_node1 15627 1726882499.03696: done getting next task for host managed_node1 15627 1726882499.03698: ^ task is: TASK: Get NM profile info 15627 1726882499.03703: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882499.03707: getting variables 15627 1726882499.03709: in VariableManager get_vars() 15627 1726882499.03738: Calling all_inventory to load vars for managed_node1 15627 1726882499.03742: Calling groups_inventory to load vars for managed_node1 15627 1726882499.03746: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882499.03759: Calling all_plugins_play to load vars for managed_node1 15627 1726882499.03765: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882499.03769: Calling groups_plugins_play to load vars for managed_node1 15627 1726882499.05563: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000048c 15627 1726882499.05567: WORKER PROCESS EXITING 15627 1726882499.06771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882499.08631: done with get_vars() 15627 1726882499.08656: done getting variables 15627 1726882499.08718: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:34:59 -0400 (0:00:00.069) 0:00:38.839 ****** 15627 1726882499.08756: entering _queue_task() for managed_node1/shell 15627 1726882499.09091: worker is 1 (out of 1 available) 15627 1726882499.09104: exiting _queue_task() for managed_node1/shell 15627 1726882499.09117: done queuing things up, now waiting for results queue to drain 15627 1726882499.09118: waiting for pending results... 
15627 1726882499.09444: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15627 1726882499.09604: in run() - task 0e448fcc-3ce9-2847-7723-00000000048d 15627 1726882499.09657: variable 'ansible_search_path' from source: unknown 15627 1726882499.09667: variable 'ansible_search_path' from source: unknown 15627 1726882499.09714: calling self._execute() 15627 1726882499.09812: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882499.09822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882499.09844: variable 'omit' from source: magic vars 15627 1726882499.10249: variable 'ansible_distribution_major_version' from source: facts 15627 1726882499.10271: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882499.10311: variable 'omit' from source: magic vars 15627 1726882499.10368: variable 'omit' from source: magic vars 15627 1726882499.11257: variable 'profile' from source: play vars 15627 1726882499.11324: variable 'interface' from source: set_fact 15627 1726882499.11390: variable 'interface' from source: set_fact 15627 1726882499.11556: variable 'omit' from source: magic vars 15627 1726882499.11602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882499.11673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882499.11698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882499.11772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882499.11790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882499.11823: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 
1726882499.11922: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882499.11931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882499.12046: Set connection var ansible_timeout to 10 15627 1726882499.12066: Set connection var ansible_shell_executable to /bin/sh 15627 1726882499.12083: Set connection var ansible_connection to ssh 15627 1726882499.12093: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882499.12103: Set connection var ansible_pipelining to False 15627 1726882499.12110: Set connection var ansible_shell_type to sh 15627 1726882499.12158: variable 'ansible_shell_executable' from source: unknown 15627 1726882499.12173: variable 'ansible_connection' from source: unknown 15627 1726882499.12182: variable 'ansible_module_compression' from source: unknown 15627 1726882499.12195: variable 'ansible_shell_type' from source: unknown 15627 1726882499.12202: variable 'ansible_shell_executable' from source: unknown 15627 1726882499.12209: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882499.12215: variable 'ansible_pipelining' from source: unknown 15627 1726882499.12222: variable 'ansible_timeout' from source: unknown 15627 1726882499.12229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882499.12447: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882499.12466: variable 'omit' from source: magic vars 15627 1726882499.12476: starting attempt loop 15627 1726882499.12483: running the handler 15627 1726882499.12512: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882499.12549: _low_level_execute_command(): starting 15627 1726882499.12562: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882499.13533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.13537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.13572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882499.13575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.13578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.13580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.13648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.13651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882499.13667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882499.13796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882499.15514: stdout chunk (state=3): >>>/root <<< 15627 1726882499.15704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882499.15708: stdout chunk (state=3): >>><<< 15627 1726882499.15711: stderr chunk (state=3): >>><<< 15627 1726882499.15826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882499.15831: _low_level_execute_command(): starting 15627 1726882499.15834: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408 `" && echo ansible-tmp-1726882499.1573007-17278-265636248964408="` echo /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408 `" ) && sleep 0' 15627 1726882499.16414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882499.16426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.16438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.16452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.16499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.16509: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882499.16520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.16537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882499.16549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882499.16560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882499.16573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.16592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.16606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.16616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.16626: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882499.16637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.16721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.16741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882499.16754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15627 1726882499.16879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882499.18809: stdout chunk (state=3): >>>ansible-tmp-1726882499.1573007-17278-265636248964408=/root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408 <<< 15627 1726882499.18997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882499.19001: stdout chunk (state=3): >>><<< 15627 1726882499.19004: stderr chunk (state=3): >>><<< 15627 1726882499.19050: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882499.1573007-17278-265636248964408=/root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882499.19270: variable 'ansible_module_compression' from source: unknown 15627 1726882499.19274: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15627 1726882499.19276: variable 'ansible_facts' from source: unknown 15627 1726882499.19278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/AnsiballZ_command.py 15627 1726882499.19624: Sending initial data 15627 1726882499.19627: Sent initial data (156 bytes) 15627 1726882499.20553: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882499.20572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.20592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.20630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.20679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.20696: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882499.20729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.20748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882499.20761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882499.20777: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882499.20790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.20806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.20845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.20857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 
1726882499.20869: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882499.20882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.20975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.21000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882499.21317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882499.22452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882499.24221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882499.24316: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882499.24417: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmph6minkcj /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/AnsiballZ_command.py <<< 15627 1726882499.24540: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882499.25899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882499.26200: stderr chunk (state=3): >>><<< 15627 1726882499.26203: stdout chunk (state=3): >>><<< 15627 1726882499.26225: done transferring 
module to remote 15627 1726882499.26236: _low_level_execute_command(): starting 15627 1726882499.26241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/ /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/AnsiballZ_command.py && sleep 0' 15627 1726882499.27743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882499.27756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.27759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.27775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.27809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.27816: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882499.27826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.27840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882499.27847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882499.27857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882499.27860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.27872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.27884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.27892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.27898: stderr chunk (state=3): >>>debug2: match found <<< 
15627 1726882499.27907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.27988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.27995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882499.27998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882499.28124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882499.30004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882499.30007: stdout chunk (state=3): >>><<< 15627 1726882499.30010: stderr chunk (state=3): >>><<< 15627 1726882499.30128: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882499.30133: 
_low_level_execute_command(): starting 15627 1726882499.30139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/AnsiballZ_command.py && sleep 0' 15627 1726882499.32034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.32101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.32232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882499.32236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.32346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882499.32447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.32453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.32457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882499.32460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.32601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.32647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882499.32797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 
1726882499.47665: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:59.457717", "end": "2024-09-20 21:34:59.475125", "delta": "0:00:00.017408", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882499.48862: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. <<< 15627 1726882499.48964: stderr chunk (state=3): >>><<< 15627 1726882499.48968: stdout chunk (state=3): >>><<< 15627 1726882499.48990: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:59.457717", "end": "2024-09-20 21:34:59.475125", "delta": "0:00:00.017408", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 15627 1726882499.49114: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882499.49125: _low_level_execute_command(): starting 15627 1726882499.49128: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882499.1573007-17278-265636248964408/ > /dev/null 2>&1 && sleep 0' 15627 1726882499.50541: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882499.50550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.50560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.50640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.50712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.50716: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882499.50751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.50766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882499.50781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882499.50784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882499.50794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882499.50809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882499.50819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882499.50826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882499.50833: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882499.50841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882499.50920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882499.50942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882499.50945: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15627 1726882499.51074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882499.52882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882499.52959: stderr chunk (state=3): >>><<< 15627 1726882499.52973: stdout chunk (state=3): >>><<< 15627 1726882499.53275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882499.53279: handler run complete 15627 1726882499.53281: Evaluated conditional (False): False 15627 1726882499.53283: attempt loop complete, returning result 15627 1726882499.53285: _execute() done 15627 1726882499.53287: dumping result to json 15627 1726882499.53289: done dumping result, returning 15627 1726882499.53291: done running TaskExecutor() for managed_node1/TASK: Get NM profile info 
[0e448fcc-3ce9-2847-7723-00000000048d] 15627 1726882499.53293: sending task result for task 0e448fcc-3ce9-2847-7723-00000000048d 15627 1726882499.53368: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000048d 15627 1726882499.53371: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017408", "end": "2024-09-20 21:34:59.475125", "rc": 1, "start": "2024-09-20 21:34:59.457717" } MSG: non-zero return code ...ignoring 15627 1726882499.53458: no more pending results, returning what we have 15627 1726882499.53461: results queue empty 15627 1726882499.53462: checking for any_errors_fatal 15627 1726882499.53476: done checking for any_errors_fatal 15627 1726882499.53476: checking for max_fail_percentage 15627 1726882499.53478: done checking for max_fail_percentage 15627 1726882499.53479: checking to see if all hosts have failed and the running result is not ok 15627 1726882499.53480: done checking to see if all hosts have failed 15627 1726882499.53481: getting the remaining hosts for this loop 15627 1726882499.53483: done getting the remaining hosts for this loop 15627 1726882499.53488: getting the next task for host managed_node1 15627 1726882499.53495: done getting next task for host managed_node1 15627 1726882499.53498: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15627 1726882499.53503: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882499.53506: getting variables 15627 1726882499.53508: in VariableManager get_vars() 15627 1726882499.53539: Calling all_inventory to load vars for managed_node1 15627 1726882499.53542: Calling groups_inventory to load vars for managed_node1 15627 1726882499.53546: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882499.53561: Calling all_plugins_play to load vars for managed_node1 15627 1726882499.53566: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882499.53570: Calling groups_plugins_play to load vars for managed_node1 15627 1726882499.54731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882499.57869: done with get_vars() 15627 1726882499.57897: done getting variables 15627 1726882499.57974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:34:59 -0400 (0:00:00.492) 0:00:39.331 ****** 15627 1726882499.58007: entering _queue_task() for 
managed_node1/set_fact
15627 1726882499.58346: worker is 1 (out of 1 available)
15627 1726882499.58359: exiting _queue_task() for managed_node1/set_fact
15627 1726882499.58373: done queuing things up, now waiting for results queue to drain
15627 1726882499.58375: waiting for pending results...
15627 1726882499.59095: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
15627 1726882499.59180: in run() - task 0e448fcc-3ce9-2847-7723-00000000048e
15627 1726882499.59196: variable 'ansible_search_path' from source: unknown
15627 1726882499.59200: variable 'ansible_search_path' from source: unknown
15627 1726882499.59235: calling self._execute()
15627 1726882499.59325: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882499.59328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882499.59339: variable 'omit' from source: magic vars
15627 1726882499.59709: variable 'ansible_distribution_major_version' from source: facts
15627 1726882499.59720: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882499.59850: variable 'nm_profile_exists' from source: set_fact
15627 1726882499.59870: Evaluated conditional (nm_profile_exists.rc == 0): False
15627 1726882499.59873: when evaluation is False, skipping this task
15627 1726882499.59876: _execute() done
15627 1726882499.59879: dumping result to json
15627 1726882499.59882: done dumping result, returning
15627 1726882499.59889: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-2847-7723-00000000048e]
15627 1726882499.59895: sending task result for task 0e448fcc-3ce9-2847-7723-00000000048e
15627 1726882499.59984: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000048e
15627 1726882499.59987: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
15627 1726882499.60032: no more pending results, returning what we have
15627 1726882499.60035: results queue empty
15627 1726882499.60036: checking for any_errors_fatal
15627 1726882499.60047: done checking for any_errors_fatal
15627 1726882499.60048: checking for max_fail_percentage
15627 1726882499.60049: done checking for max_fail_percentage
15627 1726882499.60050: checking to see if all hosts have failed and the running result is not ok
15627 1726882499.60051: done checking to see if all hosts have failed
15627 1726882499.60052: getting the remaining hosts for this loop
15627 1726882499.60057: done getting the remaining hosts for this loop
15627 1726882499.60061: getting the next task for host managed_node1
15627 1726882499.60074: done getting next task for host managed_node1
15627 1726882499.60077: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
15627 1726882499.60081: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882499.60084: getting variables
15627 1726882499.60086: in VariableManager get_vars()
15627 1726882499.60112: Calling all_inventory to load vars for managed_node1
15627 1726882499.60115: Calling groups_inventory to load vars for managed_node1
15627 1726882499.60118: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882499.60127: Calling all_plugins_play to load vars for managed_node1
15627 1726882499.60130: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882499.60132: Calling groups_plugins_play to load vars for managed_node1
15627 1726882499.63138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882499.66786: done with get_vars()
15627 1726882499.66813: done getting variables
15627 1726882499.66876: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882499.67221: variable 'profile' from source: play vars
15627 1726882499.67226: variable 'interface' from source: set_fact
15627 1726882499.67288: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] *******************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 21:34:59 -0400 (0:00:00.093) 0:00:39.425 ******
15627 1726882499.67436: entering _queue_task() for managed_node1/command
15627 1726882499.68103: worker is 1 (out of 1 available)
15627 1726882499.68115: exiting _queue_task() for managed_node1/command
15627 1726882499.68127: done queuing things up, now waiting for results queue to drain
15627 1726882499.68129: waiting for pending results...
15627 1726882499.68972: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31
15627 1726882499.69089: in run() - task 0e448fcc-3ce9-2847-7723-000000000490
15627 1726882499.69104: variable 'ansible_search_path' from source: unknown
15627 1726882499.69108: variable 'ansible_search_path' from source: unknown
15627 1726882499.69143: calling self._execute()
15627 1726882499.69233: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882499.69238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882499.69249: variable 'omit' from source: magic vars
15627 1726882499.69677: variable 'ansible_distribution_major_version' from source: facts
15627 1726882499.69681: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882499.70454: variable 'profile_stat' from source: set_fact
15627 1726882499.70470: Evaluated conditional (profile_stat.stat.exists): False
15627 1726882499.70473: when evaluation is False, skipping this task
15627 1726882499.70475: _execute() done
15627 1726882499.70478: dumping result to json
15627 1726882499.70481: done dumping result, returning
15627 1726882499.70487: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000490]
15627 1726882499.70492: sending task result for task 0e448fcc-3ce9-2847-7723-000000000490
15627 1726882499.70584: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000490
15627 1726882499.70587: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15627 1726882499.70658: no more pending results, returning what we have
15627 1726882499.70661: results queue empty
15627 1726882499.70662: checking for any_errors_fatal
15627 1726882499.70677: done checking for any_errors_fatal
15627 1726882499.70678: checking for max_fail_percentage
15627 1726882499.70680: done checking for max_fail_percentage
15627 1726882499.70681: checking to see if all hosts have failed and the running result is not ok
15627 1726882499.70682: done checking to see if all hosts have failed
15627 1726882499.70683: getting the remaining hosts for this loop
15627 1726882499.70685: done getting the remaining hosts for this loop
15627 1726882499.70689: getting the next task for host managed_node1
15627 1726882499.70698: done getting next task for host managed_node1
15627 1726882499.70701: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
15627 1726882499.70706: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882499.70710: getting variables
15627 1726882499.70712: in VariableManager get_vars()
15627 1726882499.70743: Calling all_inventory to load vars for managed_node1
15627 1726882499.70746: Calling groups_inventory to load vars for managed_node1
15627 1726882499.70751: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882499.70766: Calling all_plugins_play to load vars for managed_node1
15627 1726882499.70770: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882499.70773: Calling groups_plugins_play to load vars for managed_node1
15627 1726882499.73662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882499.75770: done with get_vars()
15627 1726882499.75793: done getting variables
15627 1726882499.75856: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882499.75966: variable 'profile' from source: play vars
15627 1726882499.75970: variable 'interface' from source: set_fact
15627 1726882499.76022: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 21:34:59 -0400 (0:00:00.086) 0:00:39.512 ******
15627 1726882499.76050: entering _queue_task() for managed_node1/set_fact
15627 1726882499.76396: worker is 1 (out of 1 available)
15627 1726882499.76409: exiting _queue_task() for managed_node1/set_fact
15627 1726882499.76427: done queuing things up, now waiting for results queue to drain
15627 1726882499.76428: waiting for pending results...
15627 1726882499.77381: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31
15627 1726882499.77851: in run() - task 0e448fcc-3ce9-2847-7723-000000000491
15627 1726882499.77875: variable 'ansible_search_path' from source: unknown
15627 1726882499.77879: variable 'ansible_search_path' from source: unknown
15627 1726882499.77912: calling self._execute()
15627 1726882499.78019: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882499.78023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882499.78036: variable 'omit' from source: magic vars
15627 1726882499.78519: variable 'ansible_distribution_major_version' from source: facts
15627 1726882499.78530: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882499.78670: variable 'profile_stat' from source: set_fact
15627 1726882499.79083: Evaluated conditional (profile_stat.stat.exists): False
15627 1726882499.79086: when evaluation is False, skipping this task
15627 1726882499.79088: _execute() done
15627 1726882499.79090: dumping result to json
15627 1726882499.79092: done dumping result, returning
15627 1726882499.79094: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000491]
15627 1726882499.79096: sending task result for task 0e448fcc-3ce9-2847-7723-000000000491
15627 1726882499.79166: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000491
15627 1726882499.79170: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15627 1726882499.79219: no more pending results, returning what we have
15627 1726882499.79222: results queue empty
15627 1726882499.79223: checking for any_errors_fatal
15627 1726882499.79230: done checking for any_errors_fatal
15627 1726882499.79231: checking for max_fail_percentage
15627 1726882499.79233: done checking for max_fail_percentage
15627 1726882499.79233: checking to see if all hosts have failed and the running result is not ok
15627 1726882499.79235: done checking to see if all hosts have failed
15627 1726882499.79235: getting the remaining hosts for this loop
15627 1726882499.79237: done getting the remaining hosts for this loop
15627 1726882499.79241: getting the next task for host managed_node1
15627 1726882499.79248: done getting next task for host managed_node1
15627 1726882499.79251: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
15627 1726882499.79257: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882499.79260: getting variables
15627 1726882499.79262: in VariableManager get_vars()
15627 1726882499.79294: Calling all_inventory to load vars for managed_node1
15627 1726882499.79298: Calling groups_inventory to load vars for managed_node1
15627 1726882499.79301: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882499.79314: Calling all_plugins_play to load vars for managed_node1
15627 1726882499.79317: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882499.79320: Calling groups_plugins_play to load vars for managed_node1
15627 1726882499.81341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882499.83299: done with get_vars()
15627 1726882499.83320: done getting variables
15627 1726882499.83378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882499.83888: variable 'profile' from source: play vars
15627 1726882499.83892: variable 'interface' from source: set_fact
15627 1726882499.83956: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] ***********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:34:59 -0400 (0:00:00.079) 0:00:39.591 ******
15627 1726882499.83992: entering _queue_task() for managed_node1/command
15627 1726882499.84416: worker is 1 (out of 1 available)
15627 1726882499.84429: exiting _queue_task() for managed_node1/command
15627 1726882499.84442: done queuing things up, now waiting for results queue to drain
15627 1726882499.84444: waiting for pending results...
15627 1726882499.85108: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31
15627 1726882499.85207: in run() - task 0e448fcc-3ce9-2847-7723-000000000492
15627 1726882499.85227: variable 'ansible_search_path' from source: unknown
15627 1726882499.85233: variable 'ansible_search_path' from source: unknown
15627 1726882499.85281: calling self._execute()
15627 1726882499.85381: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882499.85385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882499.85395: variable 'omit' from source: magic vars
15627 1726882499.85776: variable 'ansible_distribution_major_version' from source: facts
15627 1726882499.86468: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882499.86472: variable 'profile_stat' from source: set_fact
15627 1726882499.86475: Evaluated conditional (profile_stat.stat.exists): False
15627 1726882499.86477: when evaluation is False, skipping this task
15627 1726882499.86479: _execute() done
15627 1726882499.86481: dumping result to json
15627 1726882499.86483: done dumping result, returning
15627 1726882499.86485: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000492]
15627 1726882499.86486: sending task result for task 0e448fcc-3ce9-2847-7723-000000000492
15627 1726882499.86541: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000492
15627 1726882499.86545: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15627 1726882499.86590: no more pending results, returning what we have
15627 1726882499.86593: results queue empty
15627 1726882499.86594: checking for any_errors_fatal
15627 1726882499.86601: done checking for any_errors_fatal
15627 1726882499.86602: checking for max_fail_percentage
15627 1726882499.86603: done checking for max_fail_percentage
15627 1726882499.86604: checking to see if all hosts have failed and the running result is not ok
15627 1726882499.86605: done checking to see if all hosts have failed
15627 1726882499.86606: getting the remaining hosts for this loop
15627 1726882499.86607: done getting the remaining hosts for this loop
15627 1726882499.86610: getting the next task for host managed_node1
15627 1726882499.86617: done getting next task for host managed_node1
15627 1726882499.86620: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
15627 1726882499.86624: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882499.86627: getting variables
15627 1726882499.86628: in VariableManager get_vars()
15627 1726882499.86653: Calling all_inventory to load vars for managed_node1
15627 1726882499.86658: Calling groups_inventory to load vars for managed_node1
15627 1726882499.86662: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882499.86673: Calling all_plugins_play to load vars for managed_node1
15627 1726882499.86676: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882499.86679: Calling groups_plugins_play to load vars for managed_node1
15627 1726882499.95977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882499.98049: done with get_vars()
15627 1726882499.98075: done getting variables
15627 1726882499.98122: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882499.98217: variable 'profile' from source: play vars
15627 1726882499.98220: variable 'interface' from source: set_fact
15627 1726882499.98278: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:34:59 -0400 (0:00:00.143) 0:00:39.734 ******
15627 1726882499.98305: entering _queue_task() for managed_node1/set_fact
15627 1726882499.98711: worker is 1 (out of 1 available)
15627 1726882499.98740: exiting _queue_task() for managed_node1/set_fact
15627 1726882499.98764: done queuing things up, now waiting for results queue to drain
15627 1726882499.98766: waiting for pending results...
15627 1726882499.99088: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31
15627 1726882499.99227: in run() - task 0e448fcc-3ce9-2847-7723-000000000493
15627 1726882499.99248: variable 'ansible_search_path' from source: unknown
15627 1726882499.99251: variable 'ansible_search_path' from source: unknown
15627 1726882499.99308: calling self._execute()
15627 1726882499.99440: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882499.99444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882499.99474: variable 'omit' from source: magic vars
15627 1726882499.99989: variable 'ansible_distribution_major_version' from source: facts
15627 1726882500.00004: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882500.00166: variable 'profile_stat' from source: set_fact
15627 1726882500.00205: Evaluated conditional (profile_stat.stat.exists): False
15627 1726882500.00208: when evaluation is False, skipping this task
15627 1726882500.00227: _execute() done
15627 1726882500.00232: dumping result to json
15627 1726882500.00235: done dumping result, returning
15627 1726882500.00242: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0e448fcc-3ce9-2847-7723-000000000493]
15627 1726882500.00246: sending task result for task 0e448fcc-3ce9-2847-7723-000000000493
15627 1726882500.00401: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000493
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15627 1726882500.00456: no more pending results, returning what we have
15627 1726882500.00459: results queue empty
15627 1726882500.00460: checking for any_errors_fatal
15627 1726882500.00477: done checking for any_errors_fatal
15627 1726882500.00478: checking for max_fail_percentage
15627 1726882500.00480: done checking for max_fail_percentage
15627 1726882500.00481: checking to see if all hosts have failed and the running result is not ok
15627 1726882500.00483: done checking to see if all hosts have failed
15627 1726882500.00483: getting the remaining hosts for this loop
15627 1726882500.00485: done getting the remaining hosts for this loop
15627 1726882500.00489: getting the next task for host managed_node1
15627 1726882500.00500: done getting next task for host managed_node1
15627 1726882500.00506: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
15627 1726882500.00512: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882500.00516: getting variables
15627 1726882500.00518: in VariableManager get_vars()
15627 1726882500.00554: Calling all_inventory to load vars for managed_node1
15627 1726882500.00557: Calling groups_inventory to load vars for managed_node1
15627 1726882500.00561: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882500.00569: WORKER PROCESS EXITING
15627 1726882500.00594: Calling all_plugins_play to load vars for managed_node1
15627 1726882500.00599: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882500.00603: Calling groups_plugins_play to load vars for managed_node1
15627 1726882500.02529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882500.04347: done with get_vars()
15627 1726882500.04375: done getting variables
15627 1726882500.04440: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
15627 1726882500.04573: variable 'profile' from source: play vars
15627 1726882500.04577: variable 'interface' from source: set_fact
15627 1726882500.04637: variable 'interface' from source: set_fact

TASK [Assert that the profile is absent - 'LSR-TST-br31'] **********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 21:35:00 -0400 (0:00:00.063) 0:00:39.798 ******
15627 1726882500.04675: entering _queue_task() for managed_node1/assert
15627 1726882500.04988: worker is 1 (out of 1 available)
15627 1726882500.04999: exiting _queue_task() for managed_node1/assert
15627 1726882500.05011: done queuing things up, now waiting for results queue to drain
15627 1726882500.05013: waiting for pending results...
15627 1726882500.05319: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31'
15627 1726882500.05417: in run() - task 0e448fcc-3ce9-2847-7723-000000000480
15627 1726882500.05429: variable 'ansible_search_path' from source: unknown
15627 1726882500.05433: variable 'ansible_search_path' from source: unknown
15627 1726882500.05470: calling self._execute()
15627 1726882500.05557: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882500.05563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882500.05575: variable 'omit' from source: magic vars
15627 1726882500.05842: variable 'ansible_distribution_major_version' from source: facts
15627 1726882500.05851: Evaluated conditional (ansible_distribution_major_version != '6'): True
15627 1726882500.05859: variable 'omit' from source: magic vars
15627 1726882500.05891: variable 'omit' from source: magic vars
15627 1726882500.05962: variable 'profile' from source: play vars
15627 1726882500.05967: variable 'interface' from source: set_fact
15627 1726882500.06011: variable 'interface' from source: set_fact
15627 1726882500.06027: variable 'omit' from source: magic vars
15627 1726882500.06062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15627 1726882500.06090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15627 1726882500.06107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15627 1726882500.06119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882500.06131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15627 1726882500.06153: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15627 1726882500.06162: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882500.06166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882500.06231: Set connection var ansible_timeout to 10
15627 1726882500.06238: Set connection var ansible_shell_executable to /bin/sh
15627 1726882500.06243: Set connection var ansible_connection to ssh
15627 1726882500.06248: Set connection var ansible_module_compression to ZIP_DEFLATED
15627 1726882500.06253: Set connection var ansible_pipelining to False
15627 1726882500.06256: Set connection var ansible_shell_type to sh
15627 1726882500.06280: variable 'ansible_shell_executable' from source: unknown
15627 1726882500.06283: variable 'ansible_connection' from source: unknown
15627 1726882500.06286: variable 'ansible_module_compression' from source: unknown
15627 1726882500.06289: variable 'ansible_shell_type' from source: unknown
15627 1726882500.06291: variable 'ansible_shell_executable' from source: unknown
15627 1726882500.06294: variable 'ansible_host' from source: host vars for 'managed_node1'
15627 1726882500.06297: variable 'ansible_pipelining' from source: unknown
15627 1726882500.06299: variable 'ansible_timeout' from source: unknown
15627 1726882500.06301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15627 1726882500.06402: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
15627 1726882500.06412: variable 'omit' from source: magic vars
15627 1726882500.06417: starting attempt loop
15627 1726882500.06420: running the handler
15627 1726882500.06504: variable 'lsr_net_profile_exists' from source: set_fact
15627 1726882500.06508: Evaluated conditional (not lsr_net_profile_exists): True
15627 1726882500.06513: handler run complete
15627 1726882500.06525: attempt loop complete, returning result
15627 1726882500.06528: _execute() done
15627 1726882500.06531: dumping result to json
15627 1726882500.06535: done dumping result, returning
15627 1726882500.06539: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-000000000480]
15627 1726882500.06544: sending task result for task 0e448fcc-3ce9-2847-7723-000000000480
15627 1726882500.06631: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000480
15627 1726882500.06634: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
15627 1726882500.06694: no more pending results, returning what we have
15627 1726882500.06697: results queue empty
15627 1726882500.06698: checking for any_errors_fatal
15627 1726882500.06706: done checking for any_errors_fatal
15627 1726882500.06707: checking for max_fail_percentage
15627 1726882500.06708: done checking for max_fail_percentage
15627 1726882500.06710: checking to see if all hosts have failed and the running result is not ok
15627 1726882500.06711: done checking to see if all hosts have failed
15627 1726882500.06712: getting the remaining hosts for this loop
15627 1726882500.06713: done getting the remaining hosts for this loop
15627 1726882500.06716: getting the next task for host managed_node1
15627 1726882500.06725: done getting next task for host managed_node1
15627 1726882500.06726: ^ task is: TASK: meta (flush_handlers)
15627 1726882500.06728: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882500.06732: getting variables
15627 1726882500.06733: in VariableManager get_vars()
15627 1726882500.06755: Calling all_inventory to load vars for managed_node1
15627 1726882500.06758: Calling groups_inventory to load vars for managed_node1
15627 1726882500.06761: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882500.06772: Calling all_plugins_play to load vars for managed_node1
15627 1726882500.06775: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882500.06777: Calling groups_plugins_play to load vars for managed_node1
15627 1726882500.07811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882500.09108: done with get_vars()
15627 1726882500.09123: done getting variables
15627 1726882500.09172: in VariableManager get_vars()
15627 1726882500.09179: Calling all_inventory to load vars for managed_node1
15627 1726882500.09180: Calling groups_inventory to load vars for managed_node1
15627 1726882500.09182: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882500.09186: Calling all_plugins_play to load vars for managed_node1
15627 1726882500.09187: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882500.09189: Calling groups_plugins_play to load vars for managed_node1
15627 1726882500.09941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882500.11110: done with get_vars()
15627 1726882500.11133: done queuing things up, now waiting for results queue to drain
15627 1726882500.11135: results queue empty
15627 1726882500.11136: checking for any_errors_fatal
15627 1726882500.11138: done checking for any_errors_fatal
15627 1726882500.11139: checking for max_fail_percentage
15627 1726882500.11139: done checking for max_fail_percentage
15627 1726882500.11140: checking to see if all hosts have failed and the running result is not ok
15627 1726882500.11145: done checking to see if all hosts have failed
15627 1726882500.11146: getting the remaining hosts for this loop
15627 1726882500.11147: done getting the remaining hosts for this loop
15627 1726882500.11150: getting the next task for host managed_node1
15627 1726882500.11154: done getting next task for host managed_node1
15627 1726882500.11155: ^ task is: TASK: meta (flush_handlers)
15627 1726882500.11156: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882500.11159: getting variables
15627 1726882500.11160: in VariableManager get_vars()
15627 1726882500.11169: Calling all_inventory to load vars for managed_node1
15627 1726882500.11171: Calling groups_inventory to load vars for managed_node1
15627 1726882500.11173: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882500.11177: Calling all_plugins_play to load vars for managed_node1
15627 1726882500.11180: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882500.11182: Calling groups_plugins_play to load vars for managed_node1
15627 1726882500.12366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882500.13424: done with get_vars()
15627 1726882500.13437: done getting variables
15627 1726882500.13471: in VariableManager get_vars()
15627 1726882500.13477: Calling all_inventory to load vars for managed_node1
15627 1726882500.13478: Calling groups_inventory to load vars for managed_node1
15627 1726882500.13479: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882500.13482: Calling all_plugins_play to load vars for managed_node1
15627 1726882500.13484: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882500.13487: Calling groups_plugins_play to load vars for managed_node1
15627 1726882500.14160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882500.15500: done with get_vars()
15627 1726882500.15523: done queuing things up, now waiting for results queue to drain
15627 1726882500.15528: results queue empty
15627 1726882500.15529: checking for any_errors_fatal
15627 1726882500.15530: done checking for any_errors_fatal
15627 1726882500.15531: checking for max_fail_percentage
15627 1726882500.15532: done checking for max_fail_percentage
15627 1726882500.15533: checking to see if all hosts have failed and the running result is not ok
15627 1726882500.15534: done checking to see if all hosts have failed
15627 1726882500.15534: getting the remaining hosts for this loop
15627 1726882500.15535: done getting the remaining hosts for this loop
15627 1726882500.15538: getting the next task for host managed_node1
15627 1726882500.15541: done getting next task for host managed_node1
15627 1726882500.15542: ^ task is: None
15627 1726882500.15543: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882500.15544: done queuing things up, now waiting for results queue to drain
15627 1726882500.15545: results queue empty
15627 1726882500.15546: checking for any_errors_fatal
15627 1726882500.15546: done checking for any_errors_fatal
15627 1726882500.15547: checking for max_fail_percentage
15627 1726882500.15548: done checking for max_fail_percentage
15627 1726882500.15549: checking to see if all hosts have failed and the running result is not ok
15627 1726882500.15549: done checking to see if all hosts have failed
15627 1726882500.15550: getting the next task for host managed_node1
15627 1726882500.15552: done getting next task for host managed_node1
15627 1726882500.15553: ^ task is: None
15627 1726882500.15554: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15627 1726882500.15596: in VariableManager get_vars() 15627 1726882500.15610: done with get_vars() 15627 1726882500.15616: in VariableManager get_vars() 15627 1726882500.15627: done with get_vars() 15627 1726882500.15630: variable 'omit' from source: magic vars 15627 1726882500.15718: variable 'task' from source: play vars 15627 1726882500.15752: in VariableManager get_vars() 15627 1726882500.15759: done with get_vars() 15627 1726882500.15777: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15627 1726882500.15942: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882500.15971: getting the remaining hosts for this loop 15627 1726882500.15973: done getting the remaining hosts for this loop 15627 1726882500.15975: getting the next task for host managed_node1 15627 1726882500.15978: done getting next task for host managed_node1 15627 1726882500.15980: ^ task is: TASK: Gathering Facts 15627 1726882500.15981: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882500.15983: getting variables 15627 1726882500.15985: in VariableManager get_vars() 15627 1726882500.15993: Calling all_inventory to load vars for managed_node1 15627 1726882500.15995: Calling groups_inventory to load vars for managed_node1 15627 1726882500.15997: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882500.16002: Calling all_plugins_play to load vars for managed_node1 15627 1726882500.16004: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882500.16006: Calling groups_plugins_play to load vars for managed_node1 15627 1726882500.17299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882500.18243: done with get_vars() 15627 1726882500.18256: done getting variables 15627 1726882500.18287: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:35:00 -0400 (0:00:00.136) 0:00:39.934 ****** 15627 1726882500.18304: entering _queue_task() for managed_node1/gather_facts 15627 1726882500.18524: worker is 1 (out of 1 available) 15627 1726882500.18536: exiting _queue_task() for managed_node1/gather_facts 15627 1726882500.18547: done queuing things up, now waiting for results queue to drain 15627 1726882500.18548: waiting for pending results... 
15627 1726882500.18723: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882500.18792: in run() - task 0e448fcc-3ce9-2847-7723-0000000004c5 15627 1726882500.18804: variable 'ansible_search_path' from source: unknown 15627 1726882500.18833: calling self._execute() 15627 1726882500.18903: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882500.18908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882500.18916: variable 'omit' from source: magic vars 15627 1726882500.19270: variable 'ansible_distribution_major_version' from source: facts 15627 1726882500.19289: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882500.19301: variable 'omit' from source: magic vars 15627 1726882500.19332: variable 'omit' from source: magic vars 15627 1726882500.19375: variable 'omit' from source: magic vars 15627 1726882500.19420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882500.19465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882500.19495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882500.19517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882500.19534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882500.19573: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882500.19581: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882500.19588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882500.19691: Set connection var ansible_timeout to 10 15627 1726882500.19706: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882500.19716: Set connection var ansible_connection to ssh 15627 1726882500.19725: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882500.19734: Set connection var ansible_pipelining to False 15627 1726882500.19740: Set connection var ansible_shell_type to sh 15627 1726882500.19769: variable 'ansible_shell_executable' from source: unknown 15627 1726882500.19777: variable 'ansible_connection' from source: unknown 15627 1726882500.19784: variable 'ansible_module_compression' from source: unknown 15627 1726882500.19790: variable 'ansible_shell_type' from source: unknown 15627 1726882500.19796: variable 'ansible_shell_executable' from source: unknown 15627 1726882500.19803: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882500.19810: variable 'ansible_pipelining' from source: unknown 15627 1726882500.19817: variable 'ansible_timeout' from source: unknown 15627 1726882500.19824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882500.20100: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882500.20121: variable 'omit' from source: magic vars 15627 1726882500.20130: starting attempt loop 15627 1726882500.20135: running the handler 15627 1726882500.20158: variable 'ansible_facts' from source: unknown 15627 1726882500.20193: _low_level_execute_command(): starting 15627 1726882500.20219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882500.20749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.20786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.20789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882500.20792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.20851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.20855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.20958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.22620: stdout chunk (state=3): >>>/root <<< 15627 1726882500.22736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882500.22861: stderr chunk (state=3): >>><<< 15627 1726882500.22878: stdout chunk (state=3): >>><<< 15627 1726882500.22926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882500.22966: _low_level_execute_command(): starting 15627 1726882500.22971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547 `" && echo ansible-tmp-1726882500.2292476-17337-183582711408547="` echo /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547 `" ) && sleep 0' 15627 1726882500.23759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882500.23763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882500.23777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882500.23780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.23818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.23852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.23972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.23991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882500.23998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.24102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.25993: stdout chunk (state=3): >>>ansible-tmp-1726882500.2292476-17337-183582711408547=/root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547 <<< 15627 1726882500.26092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882500.26140: stderr chunk (state=3): >>><<< 15627 1726882500.26145: stdout chunk (state=3): >>><<< 15627 1726882500.26180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882500.2292476-17337-183582711408547=/root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882500.26201: variable 'ansible_module_compression' from source: unknown 15627 1726882500.26290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882500.26333: variable 'ansible_facts' from source: unknown 15627 1726882500.26475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/AnsiballZ_setup.py 15627 1726882500.26592: Sending initial data 15627 1726882500.26601: Sent initial data (154 bytes) 15627 1726882500.27299: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882500.27317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.27335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.27345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.27393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.27405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.27505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.29253: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15627 1726882500.29278: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15627 1726882500.29294: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15627 1726882500.29306: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15627 1726882500.29329: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882500.29429: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882500.29503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmpyw9egxof /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/AnsiballZ_setup.py <<< 15627 1726882500.29609: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 
1726882500.32019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882500.32108: stderr chunk (state=3): >>><<< 15627 1726882500.32111: stdout chunk (state=3): >>><<< 15627 1726882500.32129: done transferring module to remote 15627 1726882500.32138: _low_level_execute_command(): starting 15627 1726882500.32142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/ /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/AnsiballZ_setup.py && sleep 0' 15627 1726882500.32642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882500.32671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.32723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882500.32738: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882500.32750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.32760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.32812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.32831: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.32921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.34723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882500.34757: stderr chunk (state=3): >>><<< 15627 1726882500.34761: stdout chunk (state=3): >>><<< 15627 1726882500.34795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882500.34808: _low_level_execute_command(): starting 15627 1726882500.34818: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/AnsiballZ_setup.py && sleep 0' 15627 1726882500.35376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882500.35380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.35417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.35499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.35522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.35611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.86696: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3d<<< 15627 1726882500.86720: stdout chunk (state=3): >>>dcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", 
"size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 658, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241369088, "block_size": 4096, "block_total": 65519355, "block_available": 64512053, "block_used": 1007302, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "00", "epoch": "1726882500", "epoch_int": "1726882500", "date": "2024-09-20", "time": "21:35:00", "iso8601_micro": "2024-09-21T01:35:00.862414Z", "iso8601": "2024-09-21T01:35:00Z", "iso8601_basic": "20240920T213500862414", "iso8601_basic_short": "20240920T213500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 
19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882500.88274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882500.88324: stderr chunk (state=3): >>><<< 15627 1726882500.88329: stdout chunk (state=3): >>><<< 15627 1726882500.88362: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, 
"cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 658, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241369088, "block_size": 4096, "block_total": 65519355, "block_available": 64512053, "block_used": 1007302, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": 
"systemd", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": 
"Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "00", "epoch": "1726882500", "epoch_int": "1726882500", "date": "2024-09-20", "time": "21:35:00", "iso8601_micro": "2024-09-21T01:35:00.862414Z", "iso8601": "2024-09-21T01:35:00Z", "iso8601_basic": "20240920T213500862414", "iso8601_basic_short": "20240920T213500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882500.88597: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882500.88615: _low_level_execute_command(): starting 15627 1726882500.88619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882500.2292476-17337-183582711408547/ > /dev/null 2>&1 && sleep 0' 15627 1726882500.89376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15627 1726882500.89393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882500.89411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882500.89423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882500.89471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.89474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882500.89496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882500.89567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882500.89579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882500.89696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882500.91486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882500.91552: stderr chunk (state=3): >>><<< 15627 1726882500.91557: stdout chunk (state=3): >>><<< 15627 1726882500.91584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882500.91602: handler run complete 15627 1726882500.91804: variable 'ansible_facts' from source: unknown 15627 1726882500.91933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882500.92298: variable 'ansible_facts' from source: unknown 15627 1726882500.92351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882500.92458: attempt loop complete, returning result 15627 1726882500.92477: _execute() done 15627 1726882500.92485: dumping result to json 15627 1726882500.92526: done dumping result, returning 15627 1726882500.92539: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-0000000004c5] 15627 1726882500.92542: sending task result for task 0e448fcc-3ce9-2847-7723-0000000004c5 15627 1726882500.92894: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000004c5 15627 1726882500.92898: 
WORKER PROCESS EXITING ok: [managed_node1] 15627 1726882500.93132: no more pending results, returning what we have 15627 1726882500.93134: results queue empty 15627 1726882500.93135: checking for any_errors_fatal 15627 1726882500.93136: done checking for any_errors_fatal 15627 1726882500.93136: checking for max_fail_percentage 15627 1726882500.93137: done checking for max_fail_percentage 15627 1726882500.93138: checking to see if all hosts have failed and the running result is not ok 15627 1726882500.93139: done checking to see if all hosts have failed 15627 1726882500.93139: getting the remaining hosts for this loop 15627 1726882500.93140: done getting the remaining hosts for this loop 15627 1726882500.93142: getting the next task for host managed_node1 15627 1726882500.93146: done getting next task for host managed_node1 15627 1726882500.93147: ^ task is: TASK: meta (flush_handlers) 15627 1726882500.93148: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882500.93151: getting variables 15627 1726882500.93152: in VariableManager get_vars() 15627 1726882500.93170: Calling all_inventory to load vars for managed_node1 15627 1726882500.93172: Calling groups_inventory to load vars for managed_node1 15627 1726882500.93174: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882500.93182: Calling all_plugins_play to load vars for managed_node1 15627 1726882500.93184: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882500.93186: Calling groups_plugins_play to load vars for managed_node1 15627 1726882500.93964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882500.95012: done with get_vars() 15627 1726882500.95027: done getting variables 15627 1726882500.95081: in VariableManager get_vars() 15627 1726882500.95088: Calling all_inventory to load vars for managed_node1 15627 1726882500.95089: Calling groups_inventory to load vars for managed_node1 15627 1726882500.95091: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882500.95094: Calling all_plugins_play to load vars for managed_node1 15627 1726882500.95095: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882500.95097: Calling groups_plugins_play to load vars for managed_node1 15627 1726882500.96177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882500.98076: done with get_vars() 15627 1726882500.98102: done queuing things up, now waiting for results queue to drain 15627 1726882500.98104: results queue empty 15627 1726882500.98105: checking for any_errors_fatal 15627 1726882500.98108: done checking for any_errors_fatal 15627 1726882500.98109: checking for max_fail_percentage 15627 1726882500.98113: done checking for max_fail_percentage 15627 1726882500.98114: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882500.98114: done checking to see if all hosts have failed 15627 1726882500.98115: getting the remaining hosts for this loop 15627 1726882500.98116: done getting the remaining hosts for this loop 15627 1726882500.98118: getting the next task for host managed_node1 15627 1726882500.98123: done getting next task for host managed_node1 15627 1726882500.98125: ^ task is: TASK: Include the task '{{ task }}' 15627 1726882500.98127: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882500.98129: getting variables 15627 1726882500.98130: in VariableManager get_vars() 15627 1726882500.98138: Calling all_inventory to load vars for managed_node1 15627 1726882500.98140: Calling groups_inventory to load vars for managed_node1 15627 1726882500.98142: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882500.98147: Calling all_plugins_play to load vars for managed_node1 15627 1726882500.98149: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882500.98152: Calling groups_plugins_play to load vars for managed_node1 15627 1726882500.99801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.01220: done with get_vars() 15627 1726882501.01233: done getting variables 15627 1726882501.01391: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:35:01 -0400 (0:00:00.831) 0:00:40.765 ****** 15627 1726882501.01425: entering _queue_task() for managed_node1/include_tasks 15627 1726882501.01697: worker is 
1 (out of 1 available) 15627 1726882501.01714: exiting _queue_task() for managed_node1/include_tasks 15627 1726882501.01731: done queuing things up, now waiting for results queue to drain 15627 1726882501.01733: waiting for pending results... 15627 1726882501.02001: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' 15627 1726882501.02121: in run() - task 0e448fcc-3ce9-2847-7723-000000000077 15627 1726882501.02146: variable 'ansible_search_path' from source: unknown 15627 1726882501.02188: calling self._execute() 15627 1726882501.02257: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.02266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.02276: variable 'omit' from source: magic vars 15627 1726882501.02562: variable 'ansible_distribution_major_version' from source: facts 15627 1726882501.02570: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882501.02585: variable 'task' from source: play vars 15627 1726882501.02635: variable 'task' from source: play vars 15627 1726882501.02641: _execute() done 15627 1726882501.02644: dumping result to json 15627 1726882501.02647: done dumping result, returning 15627 1726882501.02652: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' [0e448fcc-3ce9-2847-7723-000000000077] 15627 1726882501.02661: sending task result for task 0e448fcc-3ce9-2847-7723-000000000077 15627 1726882501.02759: done sending task result for task 0e448fcc-3ce9-2847-7723-000000000077 15627 1726882501.02761: WORKER PROCESS EXITING 15627 1726882501.02914: no more pending results, returning what we have 15627 1726882501.02918: in VariableManager get_vars() 15627 1726882501.02972: Calling all_inventory to load vars for managed_node1 15627 1726882501.02975: Calling groups_inventory to load vars for managed_node1 15627 1726882501.02977: Calling 
all_plugins_inventory to load vars for managed_node1 15627 1726882501.02985: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.02986: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.02988: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.03784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.04720: done with get_vars() 15627 1726882501.04735: variable 'ansible_search_path' from source: unknown 15627 1726882501.04744: we have included files to process 15627 1726882501.04745: generating all_blocks data 15627 1726882501.04746: done generating all_blocks data 15627 1726882501.04746: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882501.04747: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882501.04748: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15627 1726882501.04817: in VariableManager get_vars() 15627 1726882501.04827: done with get_vars() 15627 1726882501.04902: done processing included file 15627 1726882501.04903: iterating over new_blocks loaded from include file 15627 1726882501.04904: in VariableManager get_vars() 15627 1726882501.04911: done with get_vars() 15627 1726882501.04912: filtering new block on tags 15627 1726882501.04922: done filtering new block on tags 15627 1726882501.04924: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15627 1726882501.04927: extending task lists for all hosts with included blocks 15627 1726882501.04944: 
done extending task lists 15627 1726882501.04945: done processing included files 15627 1726882501.04945: results queue empty 15627 1726882501.04946: checking for any_errors_fatal 15627 1726882501.04947: done checking for any_errors_fatal 15627 1726882501.04948: checking for max_fail_percentage 15627 1726882501.04949: done checking for max_fail_percentage 15627 1726882501.04950: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.04950: done checking to see if all hosts have failed 15627 1726882501.04951: getting the remaining hosts for this loop 15627 1726882501.04952: done getting the remaining hosts for this loop 15627 1726882501.04954: getting the next task for host managed_node1 15627 1726882501.04957: done getting next task for host managed_node1 15627 1726882501.04958: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15627 1726882501.04960: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882501.04961: getting variables 15627 1726882501.04962: in VariableManager get_vars() 15627 1726882501.04969: Calling all_inventory to load vars for managed_node1 15627 1726882501.04970: Calling groups_inventory to load vars for managed_node1 15627 1726882501.04972: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.04975: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.04977: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.04978: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.05682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.06589: done with get_vars() 15627 1726882501.06602: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:35:01 -0400 (0:00:00.052) 0:00:40.818 ****** 15627 1726882501.06649: entering _queue_task() for managed_node1/include_tasks 15627 1726882501.06851: worker is 1 (out of 1 available) 15627 1726882501.06862: exiting _queue_task() for managed_node1/include_tasks 15627 1726882501.06875: done queuing things up, now waiting for results queue to drain 15627 1726882501.06876: waiting for pending results... 
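The nesting visible in the log so far — run_tasks.yml:6 includes `tasks/assert_device_absent.yml`, which at line 3 includes `get_interface_stat.yml` — corresponds to an include chain of roughly the following shape. This is a hypothetical reconstruction from the task names and `task path:` entries in the log, not the actual collection source:

```yaml
# Reconstruction of the include chain the log is executing (illustrative only).

# run_tasks.yml:6 -- includes whatever file the `task` play var names;
# here the play var resolves to tasks/assert_device_absent.yml.
- name: "Include the task '{{ task }}'"
  include_tasks: "{{ task }}"

# tasks/assert_device_absent.yml:3 -- the nested include seen next in the log.
- name: "Include the task 'get_interface_stat.yml'"
  include_tasks: get_interface_stat.yml
```

Each `include_tasks` shows up in the log as a "we have included files to process / generating all_blocks data" cycle, followed by "extending task lists for all hosts with included blocks".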
15627 1726882501.07041: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15627 1726882501.07118: in run() - task 0e448fcc-3ce9-2847-7723-0000000004d6 15627 1726882501.07133: variable 'ansible_search_path' from source: unknown 15627 1726882501.07137: variable 'ansible_search_path' from source: unknown 15627 1726882501.07173: calling self._execute() 15627 1726882501.07243: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.07250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.07260: variable 'omit' from source: magic vars 15627 1726882501.07521: variable 'ansible_distribution_major_version' from source: facts 15627 1726882501.07532: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882501.07537: _execute() done 15627 1726882501.07542: dumping result to json 15627 1726882501.07545: done dumping result, returning 15627 1726882501.07551: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-2847-7723-0000000004d6] 15627 1726882501.07567: sending task result for task 0e448fcc-3ce9-2847-7723-0000000004d6 15627 1726882501.07643: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000004d6 15627 1726882501.07646: WORKER PROCESS EXITING 15627 1726882501.07681: no more pending results, returning what we have 15627 1726882501.07685: in VariableManager get_vars() 15627 1726882501.07716: Calling all_inventory to load vars for managed_node1 15627 1726882501.07719: Calling groups_inventory to load vars for managed_node1 15627 1726882501.07722: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.07733: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.07736: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.07738: Calling groups_plugins_play to load vars for managed_node1 15627 
1726882501.08513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.09530: done with get_vars() 15627 1726882501.09542: variable 'ansible_search_path' from source: unknown 15627 1726882501.09543: variable 'ansible_search_path' from source: unknown 15627 1726882501.09548: variable 'task' from source: play vars 15627 1726882501.09622: variable 'task' from source: play vars 15627 1726882501.09645: we have included files to process 15627 1726882501.09646: generating all_blocks data 15627 1726882501.09647: done generating all_blocks data 15627 1726882501.09648: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882501.09648: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882501.09650: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15627 1726882501.09769: done processing included file 15627 1726882501.09771: iterating over new_blocks loaded from include file 15627 1726882501.09772: in VariableManager get_vars() 15627 1726882501.09780: done with get_vars() 15627 1726882501.09781: filtering new block on tags 15627 1726882501.09790: done filtering new block on tags 15627 1726882501.09791: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15627 1726882501.09794: extending task lists for all hosts with included blocks 15627 1726882501.09852: done extending task lists 15627 1726882501.09853: done processing included files 15627 1726882501.09854: results queue empty 15627 1726882501.09855: checking for any_errors_fatal 15627 1726882501.09857: done checking 
for any_errors_fatal 15627 1726882501.09857: checking for max_fail_percentage 15627 1726882501.09858: done checking for max_fail_percentage 15627 1726882501.09859: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.09859: done checking to see if all hosts have failed 15627 1726882501.09860: getting the remaining hosts for this loop 15627 1726882501.09861: done getting the remaining hosts for this loop 15627 1726882501.09862: getting the next task for host managed_node1 15627 1726882501.09867: done getting next task for host managed_node1 15627 1726882501.09868: ^ task is: TASK: Get stat for interface {{ interface }} 15627 1726882501.09870: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882501.09871: getting variables 15627 1726882501.09872: in VariableManager get_vars() 15627 1726882501.09877: Calling all_inventory to load vars for managed_node1 15627 1726882501.09879: Calling groups_inventory to load vars for managed_node1 15627 1726882501.09880: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.09883: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.09885: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.09886: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.10562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.11475: done with get_vars() 15627 1726882501.11488: done getting variables 15627 1726882501.11569: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:35:01 -0400 (0:00:00.049) 0:00:40.867 ****** 15627 1726882501.11588: entering _queue_task() for managed_node1/stat 15627 1726882501.11794: worker is 1 (out of 1 available) 15627 1726882501.11806: exiting _queue_task() for managed_node1/stat 15627 1726882501.11818: done queuing things up, now waiting for results queue to drain 15627 1726882501.11819: waiting for pending results... 
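The module_args echoed later in the log (`path: /sys/class/net/LSR-TST-br31`, with `get_attributes`, `get_checksum`, and `get_mime` all false) imply a stat task of roughly this shape. The `register` name is an assumption for illustration; the `interface` variable comes from `set_fact`, as the log's "variable 'interface' from source: set_fact" line shows:

```yaml
# Sketch of get_interface_stat.yml:3, inferred from the module_args in the log.
- name: "Get stat for interface {{ interface }}"
  stat:
    # A /sys/class/net/<name> entry exists only while the device exists,
    # so stat.exists doubles as a device-presence check.
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # register name is hypothetical, not from the log
```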
15627 1726882501.11995: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15627 1726882501.12070: in run() - task 0e448fcc-3ce9-2847-7723-0000000004e1 15627 1726882501.12080: variable 'ansible_search_path' from source: unknown 15627 1726882501.12084: variable 'ansible_search_path' from source: unknown 15627 1726882501.12110: calling self._execute() 15627 1726882501.12182: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.12186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.12196: variable 'omit' from source: magic vars 15627 1726882501.12457: variable 'ansible_distribution_major_version' from source: facts 15627 1726882501.12471: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882501.12477: variable 'omit' from source: magic vars 15627 1726882501.12509: variable 'omit' from source: magic vars 15627 1726882501.12578: variable 'interface' from source: set_fact 15627 1726882501.12591: variable 'omit' from source: magic vars 15627 1726882501.12624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882501.12648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882501.12672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882501.12685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.12696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.12719: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882501.12722: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.12725: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.12794: Set connection var ansible_timeout to 10 15627 1726882501.12801: Set connection var ansible_shell_executable to /bin/sh 15627 1726882501.12807: Set connection var ansible_connection to ssh 15627 1726882501.12812: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882501.12817: Set connection var ansible_pipelining to False 15627 1726882501.12820: Set connection var ansible_shell_type to sh 15627 1726882501.12836: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.12840: variable 'ansible_connection' from source: unknown 15627 1726882501.12842: variable 'ansible_module_compression' from source: unknown 15627 1726882501.12844: variable 'ansible_shell_type' from source: unknown 15627 1726882501.12847: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.12849: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.12853: variable 'ansible_pipelining' from source: unknown 15627 1726882501.12855: variable 'ansible_timeout' from source: unknown 15627 1726882501.12861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.13005: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 15627 1726882501.13013: variable 'omit' from source: magic vars 15627 1726882501.13018: starting attempt loop 15627 1726882501.13021: running the handler 15627 1726882501.13032: _low_level_execute_command(): starting 15627 1726882501.13039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882501.13556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.13575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.13593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882501.13606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882501.13617: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.13659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.13675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.13788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.15439: stdout chunk (state=3): >>>/root <<< 15627 1726882501.15540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.15594: stderr chunk (state=3): >>><<< 15627 1726882501.15597: stdout chunk (state=3): >>><<< 15627 1726882501.15617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.15628: _low_level_execute_command(): starting 15627 1726882501.15634: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028 `" && echo ansible-tmp-1726882501.1561623-17369-203476023163028="` echo /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028 `" ) && sleep 0' 15627 1726882501.16066: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.16079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.16098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882501.16115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.16163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.16182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.16274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.18124: stdout chunk (state=3): >>>ansible-tmp-1726882501.1561623-17369-203476023163028=/root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028 <<< 15627 1726882501.18234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.18284: stderr chunk (state=3): >>><<< 15627 1726882501.18287: stdout chunk (state=3): >>><<< 15627 1726882501.18302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882501.1561623-17369-203476023163028=/root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.18337: variable 'ansible_module_compression' from source: unknown 15627 1726882501.18387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15627 1726882501.18415: variable 'ansible_facts' from source: unknown 15627 1726882501.18469: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/AnsiballZ_stat.py 15627 1726882501.18566: Sending initial data 15627 1726882501.18576: Sent initial data (153 bytes) 15627 1726882501.19233: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.19237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.19274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882501.19277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.19280: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.19332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.19339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.19432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.21156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882501.21242: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882501.21337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp2h7fc1n2 /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/AnsiballZ_stat.py <<< 15627 1726882501.21425: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882501.22644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.22852: stderr chunk (state=3): >>><<< 15627 1726882501.22858: stdout chunk (state=3): >>><<< 15627 1726882501.22860: done transferring module 
to remote 15627 1726882501.22871: _low_level_execute_command(): starting 15627 1726882501.22882: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/ /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/AnsiballZ_stat.py && sleep 0' 15627 1726882501.23602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.23605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.23636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.23639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.23641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.23697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.23700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.23796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.25518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.25580: stderr chunk 
(state=3): >>><<< 15627 1726882501.25583: stdout chunk (state=3): >>><<< 15627 1726882501.25674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.25680: _low_level_execute_command(): starting 15627 1726882501.25684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/AnsiballZ_stat.py && sleep 0' 15627 1726882501.26294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882501.26308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.26330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.26353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 
1726882501.26399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882501.26412: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882501.26427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.26457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882501.26474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882501.26486: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882501.26498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.26511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.26527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.26538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882501.26563: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882501.26579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.26657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.26696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.26714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.26841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.39902: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 15627 1726882501.40926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882501.41007: stderr chunk (state=3): >>><<< 15627 1726882501.41011: stdout chunk (state=3): >>><<< 15627 1726882501.41147: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882501.41152: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882501.41155: _low_level_execute_command(): starting 15627 1726882501.41157: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882501.1561623-17369-203476023163028/ > /dev/null 2>&1 && sleep 0' 15627 1726882501.41720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882501.41735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.41750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.41772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.41816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882501.41829: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882501.41843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.41862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882501.41880: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882501.41891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882501.41904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.41918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.41937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.41949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882501.41960: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882501.41977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.42046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.42070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.42085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.42211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.44104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.44108: stdout chunk (state=3): >>><<< 15627 1726882501.44110: stderr chunk (state=3): >>><<< 15627 1726882501.44517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.44529: handler run complete 15627 1726882501.44531: attempt loop complete, returning result 15627 1726882501.44534: _execute() done 15627 1726882501.44536: dumping result to json 15627 1726882501.44538: done dumping result, returning 15627 1726882501.44540: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0e448fcc-3ce9-2847-7723-0000000004e1] 15627 1726882501.44542: sending task result for task 0e448fcc-3ce9-2847-7723-0000000004e1 15627 1726882501.44617: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000004e1 15627 1726882501.44621: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15627 1726882501.44683: no more pending results, returning what we have 15627 1726882501.44686: results queue empty 15627 1726882501.44687: checking for any_errors_fatal 15627 1726882501.44689: done checking for any_errors_fatal 15627 1726882501.44690: checking for max_fail_percentage 15627 1726882501.44692: done checking for max_fail_percentage 15627 1726882501.44693: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.44694: done checking to see if all hosts have failed 15627 1726882501.44694: getting the remaining hosts for this loop 15627 
1726882501.44696: done getting the remaining hosts for this loop 15627 1726882501.44699: getting the next task for host managed_node1 15627 1726882501.44708: done getting next task for host managed_node1 15627 1726882501.44711: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15627 1726882501.44714: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882501.44719: getting variables 15627 1726882501.44720: in VariableManager get_vars() 15627 1726882501.44749: Calling all_inventory to load vars for managed_node1 15627 1726882501.44752: Calling groups_inventory to load vars for managed_node1 15627 1726882501.44758: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.44770: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.44774: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.44777: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.46321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.48059: done with get_vars() 15627 1726882501.48082: done getting variables 15627 1726882501.48141: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 15627 1726882501.48259: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:35:01 -0400 (0:00:00.366) 0:00:41.234 ****** 15627 1726882501.48290: entering _queue_task() for managed_node1/assert 15627 1726882501.48588: worker is 1 (out of 1 available) 15627 1726882501.48600: exiting _queue_task() for managed_node1/assert 15627 1726882501.48612: done queuing things up, now waiting for results queue to drain 15627 1726882501.48613: waiting for pending results... 15627 1726882501.48898: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15627 1726882501.49008: in run() - task 0e448fcc-3ce9-2847-7723-0000000004d7 15627 1726882501.49026: variable 'ansible_search_path' from source: unknown 15627 1726882501.49033: variable 'ansible_search_path' from source: unknown 15627 1726882501.49079: calling self._execute() 15627 1726882501.49173: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.49183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.49195: variable 'omit' from source: magic vars 15627 1726882501.49556: variable 'ansible_distribution_major_version' from source: facts 15627 1726882501.49577: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882501.49588: variable 'omit' from source: magic vars 15627 1726882501.49629: variable 'omit' from source: magic vars 15627 1726882501.49733: variable 'interface' from source: set_fact 15627 1726882501.49759: variable 'omit' from source: magic vars 15627 
1726882501.49804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882501.49847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882501.49878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882501.49900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.49918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.49952: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882501.49970: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.49980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.50089: Set connection var ansible_timeout to 10 15627 1726882501.50103: Set connection var ansible_shell_executable to /bin/sh 15627 1726882501.50112: Set connection var ansible_connection to ssh 15627 1726882501.50121: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882501.50129: Set connection var ansible_pipelining to False 15627 1726882501.50136: Set connection var ansible_shell_type to sh 15627 1726882501.50170: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.50178: variable 'ansible_connection' from source: unknown 15627 1726882501.50185: variable 'ansible_module_compression' from source: unknown 15627 1726882501.50191: variable 'ansible_shell_type' from source: unknown 15627 1726882501.50198: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.50204: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.50211: variable 'ansible_pipelining' from source: unknown 15627 1726882501.50218: variable 
'ansible_timeout' from source: unknown 15627 1726882501.50226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.50377: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882501.50392: variable 'omit' from source: magic vars 15627 1726882501.50402: starting attempt loop 15627 1726882501.50408: running the handler 15627 1726882501.50553: variable 'interface_stat' from source: set_fact 15627 1726882501.50572: Evaluated conditional (not interface_stat.stat.exists): True 15627 1726882501.50585: handler run complete 15627 1726882501.50604: attempt loop complete, returning result 15627 1726882501.50611: _execute() done 15627 1726882501.50618: dumping result to json 15627 1726882501.50625: done dumping result, returning 15627 1726882501.50637: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0e448fcc-3ce9-2847-7723-0000000004d7] 15627 1726882501.50647: sending task result for task 0e448fcc-3ce9-2847-7723-0000000004d7 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15627 1726882501.50791: no more pending results, returning what we have 15627 1726882501.50794: results queue empty 15627 1726882501.50795: checking for any_errors_fatal 15627 1726882501.50804: done checking for any_errors_fatal 15627 1726882501.50805: checking for max_fail_percentage 15627 1726882501.50807: done checking for max_fail_percentage 15627 1726882501.50808: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.50809: done checking to see if all hosts have failed 15627 1726882501.50809: getting the remaining hosts for this loop 15627 1726882501.50811: done getting the remaining hosts 
for this loop 15627 1726882501.50815: getting the next task for host managed_node1 15627 1726882501.50825: done getting next task for host managed_node1 15627 1726882501.50827: ^ task is: TASK: meta (flush_handlers) 15627 1726882501.50829: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882501.50833: getting variables 15627 1726882501.50835: in VariableManager get_vars() 15627 1726882501.50867: Calling all_inventory to load vars for managed_node1 15627 1726882501.50870: Calling groups_inventory to load vars for managed_node1 15627 1726882501.50874: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.50884: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.50888: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.50891: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.51882: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000004d7 15627 1726882501.51886: WORKER PROCESS EXITING 15627 1726882501.52605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.54479: done with get_vars() 15627 1726882501.54501: done getting variables 15627 1726882501.54568: in VariableManager get_vars() 15627 1726882501.54577: Calling all_inventory to load vars for managed_node1 15627 1726882501.54579: Calling groups_inventory to load vars for managed_node1 15627 1726882501.54582: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.54587: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.54589: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.54591: Calling 
groups_plugins_play to load vars for managed_node1 15627 1726882501.55830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.57543: done with get_vars() 15627 1726882501.57575: done queuing things up, now waiting for results queue to drain 15627 1726882501.57578: results queue empty 15627 1726882501.57579: checking for any_errors_fatal 15627 1726882501.57581: done checking for any_errors_fatal 15627 1726882501.57582: checking for max_fail_percentage 15627 1726882501.57583: done checking for max_fail_percentage 15627 1726882501.57584: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.57585: done checking to see if all hosts have failed 15627 1726882501.57591: getting the remaining hosts for this loop 15627 1726882501.57592: done getting the remaining hosts for this loop 15627 1726882501.57595: getting the next task for host managed_node1 15627 1726882501.57599: done getting next task for host managed_node1 15627 1726882501.57600: ^ task is: TASK: meta (flush_handlers) 15627 1726882501.57601: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882501.57605: getting variables 15627 1726882501.57605: in VariableManager get_vars() 15627 1726882501.57614: Calling all_inventory to load vars for managed_node1 15627 1726882501.57616: Calling groups_inventory to load vars for managed_node1 15627 1726882501.57619: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.57624: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.57626: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.57629: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.62642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.64293: done with get_vars() 15627 1726882501.64313: done getting variables 15627 1726882501.64361: in VariableManager get_vars() 15627 1726882501.64371: Calling all_inventory to load vars for managed_node1 15627 1726882501.64373: Calling groups_inventory to load vars for managed_node1 15627 1726882501.64375: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.64379: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.64381: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.64384: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.65559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.67208: done with get_vars() 15627 1726882501.67232: done queuing things up, now waiting for results queue to drain 15627 1726882501.67235: results queue empty 15627 1726882501.67235: checking for any_errors_fatal 15627 1726882501.67237: done checking for any_errors_fatal 15627 1726882501.67238: checking for max_fail_percentage 15627 1726882501.67239: done checking for max_fail_percentage 15627 1726882501.67239: checking to see if all hosts have failed and the running result is not 
ok 15627 1726882501.67240: done checking to see if all hosts have failed 15627 1726882501.67241: getting the remaining hosts for this loop 15627 1726882501.67242: done getting the remaining hosts for this loop 15627 1726882501.67244: getting the next task for host managed_node1 15627 1726882501.67247: done getting next task for host managed_node1 15627 1726882501.67248: ^ task is: None 15627 1726882501.67249: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882501.67250: done queuing things up, now waiting for results queue to drain 15627 1726882501.67251: results queue empty 15627 1726882501.67252: checking for any_errors_fatal 15627 1726882501.67253: done checking for any_errors_fatal 15627 1726882501.67253: checking for max_fail_percentage 15627 1726882501.67257: done checking for max_fail_percentage 15627 1726882501.67257: checking to see if all hosts have failed and the running result is not ok 15627 1726882501.67258: done checking to see if all hosts have failed 15627 1726882501.67259: getting the next task for host managed_node1 15627 1726882501.67261: done getting next task for host managed_node1 15627 1726882501.67262: ^ task is: None 15627 1726882501.67264: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882501.67290: in VariableManager get_vars() 15627 1726882501.67302: done with get_vars() 15627 1726882501.67307: in VariableManager get_vars() 15627 1726882501.67315: done with get_vars() 15627 1726882501.67319: variable 'omit' from source: magic vars 15627 1726882501.67344: in VariableManager get_vars() 15627 1726882501.67352: done with get_vars() 15627 1726882501.67376: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15627 1726882501.67582: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15627 1726882501.67603: getting the remaining hosts for this loop 15627 1726882501.67604: done getting the remaining hosts for this loop 15627 1726882501.67607: getting the next task for host managed_node1 15627 1726882501.67609: done getting next task for host managed_node1 15627 1726882501.67611: ^ task is: TASK: Gathering Facts 15627 1726882501.67612: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882501.67614: getting variables 15627 1726882501.67615: in VariableManager get_vars() 15627 1726882501.67623: Calling all_inventory to load vars for managed_node1 15627 1726882501.67625: Calling groups_inventory to load vars for managed_node1 15627 1726882501.67628: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882501.67633: Calling all_plugins_play to load vars for managed_node1 15627 1726882501.67635: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882501.67638: Calling groups_plugins_play to load vars for managed_node1 15627 1726882501.68524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882501.69437: done with get_vars() 15627 1726882501.69449: done getting variables 15627 1726882501.69481: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 21:35:01 -0400 (0:00:00.212) 0:00:41.446 ****** 15627 1726882501.69497: entering _queue_task() for managed_node1/gather_facts 15627 1726882501.69718: worker is 1 (out of 1 available) 15627 1726882501.69729: exiting _queue_task() for managed_node1/gather_facts 15627 1726882501.69741: done queuing things up, now waiting for results queue to drain 15627 1726882501.69743: waiting for pending results... 
15627 1726882501.69946: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15627 1726882501.70030: in run() - task 0e448fcc-3ce9-2847-7723-0000000004fa 15627 1726882501.70041: variable 'ansible_search_path' from source: unknown 15627 1726882501.70078: calling self._execute() 15627 1726882501.70158: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.70499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.70503: variable 'omit' from source: magic vars 15627 1726882501.70555: variable 'ansible_distribution_major_version' from source: facts 15627 1726882501.70574: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882501.70584: variable 'omit' from source: magic vars 15627 1726882501.70612: variable 'omit' from source: magic vars 15627 1726882501.70652: variable 'omit' from source: magic vars 15627 1726882501.70700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882501.70744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882501.70771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882501.70797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.70816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882501.70849: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882501.70857: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.70866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.70964: Set connection var ansible_timeout to 10 15627 1726882501.70979: Set connection 
var ansible_shell_executable to /bin/sh 15627 1726882501.70991: Set connection var ansible_connection to ssh 15627 1726882501.71001: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882501.71010: Set connection var ansible_pipelining to False 15627 1726882501.71016: Set connection var ansible_shell_type to sh 15627 1726882501.71045: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.71052: variable 'ansible_connection' from source: unknown 15627 1726882501.71059: variable 'ansible_module_compression' from source: unknown 15627 1726882501.71067: variable 'ansible_shell_type' from source: unknown 15627 1726882501.71073: variable 'ansible_shell_executable' from source: unknown 15627 1726882501.71080: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882501.71086: variable 'ansible_pipelining' from source: unknown 15627 1726882501.71092: variable 'ansible_timeout' from source: unknown 15627 1726882501.71104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882501.71278: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882501.71293: variable 'omit' from source: magic vars 15627 1726882501.71302: starting attempt loop 15627 1726882501.71307: running the handler 15627 1726882501.71325: variable 'ansible_facts' from source: unknown 15627 1726882501.71351: _low_level_execute_command(): starting 15627 1726882501.71362: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882501.72099: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882501.72116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 
1726882501.72131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.72159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.72198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.72202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.72205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.72261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.72269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.72272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.72386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.74053: stdout chunk (state=3): >>>/root <<< 15627 1726882501.74154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.74207: stderr chunk (state=3): >>><<< 15627 1726882501.74210: stdout chunk (state=3): >>><<< 15627 1726882501.74269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.74273: _low_level_execute_command(): starting 15627 1726882501.74277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396 `" && echo ansible-tmp-1726882501.7422867-17385-209533140052396="` echo /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396 `" ) && sleep 0' 15627 1726882501.74669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882501.74690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.74694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.74696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.74729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.74732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.74734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.74783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.74798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.74809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.74908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.76775: stdout chunk (state=3): >>>ansible-tmp-1726882501.7422867-17385-209533140052396=/root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396 <<< 15627 1726882501.76884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.76923: stderr chunk (state=3): >>><<< 15627 1726882501.76928: stdout chunk (state=3): >>><<< 15627 1726882501.76943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882501.7422867-17385-209533140052396=/root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.76967: variable 'ansible_module_compression' from source: unknown 15627 1726882501.77005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15627 1726882501.77058: variable 'ansible_facts' from source: unknown 15627 1726882501.77174: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/AnsiballZ_setup.py 15627 1726882501.77278: Sending initial data 15627 1726882501.77288: Sent initial data (154 bytes) 15627 1726882501.77916: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.77921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.77958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.77961: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.77966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882501.77968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.78014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.78018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.78125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.79895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882501.79997: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 15627 1726882501.80026: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 15627 1726882501.80052: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 15627 1726882501.80174: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-15627yb6z139m/tmpenxzcreq /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/AnsiballZ_setup.py <<< 15627 1726882501.80271: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882501.82824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.82963: stderr chunk (state=3): >>><<< 15627 1726882501.82969: stdout chunk (state=3): >>><<< 15627 1726882501.82971: done transferring module to remote 15627 1726882501.82973: _low_level_execute_command(): starting 15627 1726882501.82976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/ /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/AnsiballZ_setup.py && sleep 0' 15627 1726882501.83450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.83456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.83490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882501.83494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.83496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.83541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882501.83553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.83578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.83683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882501.85505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882501.85561: stderr chunk (state=3): >>><<< 15627 1726882501.85566: stdout chunk (state=3): >>><<< 15627 1726882501.85581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882501.85584: _low_level_execute_command(): starting 15627 1726882501.85589: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/AnsiballZ_setup.py && sleep 0' 15627 1726882501.86329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882501.86350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.86352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882501.86372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882501.86424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882501.86427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882501.86430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882501.86432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882501.86490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882501.86496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882501.86603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.37721: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "02", "epoch": "1726882502", "epoch_int": "1726882502", "date": "2024-09-20", "time": "21:35:02", "iso8601_micro": "2024-09-21T01:35:02.116762Z", "iso8601": "2024-09-21T01:35:02Z", "iso8601_basic": "20240920T213502116762", "iso8601_basic_short": "20240920T213502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 660, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, 
"passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", 
"network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15627 1726882502.39295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882502.39370: stderr chunk (state=3): >>><<< 15627 1726882502.39374: stdout chunk (state=3): >>><<< 15627 1726882502.39479: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": {"1m": 0.5, "5m": 0.39, "15m": 0.2}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", 
"second": "02", "epoch": "1726882502", "epoch_int": "1726882502", "date": "2024-09-20", "time": "21:35:02", "iso8601_micro": "2024-09-21T01:35:02.116762Z", "iso8601": "2024-09-21T01:35:02Z", "iso8601_basic": "20240920T213502116762", "iso8601_basic_short": "20240920T213502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 660, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264241389568, "block_size": 4096, "block_total": 65519355, "block_available": 64512058, "block_used": 1007297, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", 
"prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
15627 1726882502.39843: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882502.39875: _low_level_execute_command(): starting 15627 1726882502.39892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882501.7422867-17385-209533140052396/ > /dev/null 2>&1 && sleep 0' 15627 1726882502.40585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.40598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.40611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.40628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.40689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.40701: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.40713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.40728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.40739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 15627 1726882502.40748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.40771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.40785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.40799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.40810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.40820: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.40831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.40918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882502.40938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.40952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.41084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.42971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882502.42974: stdout chunk (state=3): >>><<< 15627 1726882502.42977: stderr chunk (state=3): >>><<< 15627 1726882502.43381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882502.43385: handler run complete 15627 1726882502.43388: variable 'ansible_facts' from source: unknown 15627 1726882502.43390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.43992: variable 'ansible_facts' from source: unknown 15627 1726882502.44090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.44193: attempt loop complete, returning result 15627 1726882502.44202: _execute() done 15627 1726882502.44209: dumping result to json 15627 1726882502.44243: done dumping result, returning 15627 1726882502.44261: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-2847-7723-0000000004fa] 15627 1726882502.44275: sending task result for task 0e448fcc-3ce9-2847-7723-0000000004fa ok: [managed_node1] 15627 1726882502.44944: no more pending results, returning what we have 15627 1726882502.44947: results queue empty 15627 1726882502.44948: checking for any_errors_fatal 15627 1726882502.44949: done checking for any_errors_fatal 15627 1726882502.44950: checking for max_fail_percentage 15627 1726882502.44952: done checking for max_fail_percentage 15627 1726882502.44952: checking to see if all hosts have failed and the running result is not ok 15627 1726882502.44956: done 
checking to see if all hosts have failed 15627 1726882502.44957: getting the remaining hosts for this loop 15627 1726882502.44958: done getting the remaining hosts for this loop 15627 1726882502.44962: getting the next task for host managed_node1 15627 1726882502.44971: done getting next task for host managed_node1 15627 1726882502.44973: ^ task is: TASK: meta (flush_handlers) 15627 1726882502.44975: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882502.44978: getting variables 15627 1726882502.44980: in VariableManager get_vars() 15627 1726882502.45006: Calling all_inventory to load vars for managed_node1 15627 1726882502.45008: Calling groups_inventory to load vars for managed_node1 15627 1726882502.45012: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882502.45023: Calling all_plugins_play to load vars for managed_node1 15627 1726882502.45026: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882502.45029: Calling groups_plugins_play to load vars for managed_node1 15627 1726882502.45883: done sending task result for task 0e448fcc-3ce9-2847-7723-0000000004fa 15627 1726882502.45887: WORKER PROCESS EXITING 15627 1726882502.46970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.48770: done with get_vars() 15627 1726882502.48793: done getting variables 15627 1726882502.48865: in VariableManager get_vars() 15627 1726882502.48875: Calling all_inventory to load vars for managed_node1 15627 1726882502.48878: Calling groups_inventory to load vars for managed_node1 15627 1726882502.48880: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882502.48885: Calling 
all_plugins_play to load vars for managed_node1 15627 1726882502.48887: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882502.48894: Calling groups_plugins_play to load vars for managed_node1 15627 1726882502.50648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.53089: done with get_vars() 15627 1726882502.53128: done queuing things up, now waiting for results queue to drain 15627 1726882502.53131: results queue empty 15627 1726882502.53131: checking for any_errors_fatal 15627 1726882502.53136: done checking for any_errors_fatal 15627 1726882502.53137: checking for max_fail_percentage 15627 1726882502.53138: done checking for max_fail_percentage 15627 1726882502.53138: checking to see if all hosts have failed and the running result is not ok 15627 1726882502.53139: done checking to see if all hosts have failed 15627 1726882502.53140: getting the remaining hosts for this loop 15627 1726882502.53141: done getting the remaining hosts for this loop 15627 1726882502.53143: getting the next task for host managed_node1 15627 1726882502.53148: done getting next task for host managed_node1 15627 1726882502.53150: ^ task is: TASK: Verify network state restored to default 15627 1726882502.53151: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882502.53153: getting variables 15627 1726882502.53154: in VariableManager get_vars() 15627 1726882502.53166: Calling all_inventory to load vars for managed_node1 15627 1726882502.53168: Calling groups_inventory to load vars for managed_node1 15627 1726882502.53170: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882502.53177: Calling all_plugins_play to load vars for managed_node1 15627 1726882502.53180: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882502.53182: Calling groups_plugins_play to load vars for managed_node1 15627 1726882502.54577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.56347: done with get_vars() 15627 1726882502.56372: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 21:35:02 -0400 (0:00:00.869) 0:00:42.316 ****** 15627 1726882502.56444: entering _queue_task() for managed_node1/include_tasks 15627 1726882502.56776: worker is 1 (out of 1 available) 15627 1726882502.56788: exiting _queue_task() for managed_node1/include_tasks 15627 1726882502.56803: done queuing things up, now waiting for results queue to drain 15627 1726882502.56804: waiting for pending results... 
15627 1726882502.57086: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 15627 1726882502.57205: in run() - task 0e448fcc-3ce9-2847-7723-00000000007a 15627 1726882502.57230: variable 'ansible_search_path' from source: unknown 15627 1726882502.57278: calling self._execute() 15627 1726882502.57375: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882502.57387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882502.57399: variable 'omit' from source: magic vars 15627 1726882502.57827: variable 'ansible_distribution_major_version' from source: facts 15627 1726882502.57846: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882502.57858: _execute() done 15627 1726882502.57869: dumping result to json 15627 1726882502.57877: done dumping result, returning 15627 1726882502.57906: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-2847-7723-00000000007a] 15627 1726882502.57923: sending task result for task 0e448fcc-3ce9-2847-7723-00000000007a 15627 1726882502.58062: no more pending results, returning what we have 15627 1726882502.58069: in VariableManager get_vars() 15627 1726882502.58104: Calling all_inventory to load vars for managed_node1 15627 1726882502.58107: Calling groups_inventory to load vars for managed_node1 15627 1726882502.58112: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882502.58127: Calling all_plugins_play to load vars for managed_node1 15627 1726882502.58130: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882502.58133: Calling groups_plugins_play to load vars for managed_node1 15627 1726882502.59910: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000007a 15627 1726882502.59914: WORKER PROCESS EXITING 15627 1726882502.62617: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.64634: done with get_vars() 15627 1726882502.64655: variable 'ansible_search_path' from source: unknown 15627 1726882502.64674: we have included files to process 15627 1726882502.64676: generating all_blocks data 15627 1726882502.64677: done generating all_blocks data 15627 1726882502.64678: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15627 1726882502.64679: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15627 1726882502.64681: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15627 1726882502.65094: done processing included file 15627 1726882502.65096: iterating over new_blocks loaded from include file 15627 1726882502.65097: in VariableManager get_vars() 15627 1726882502.65109: done with get_vars() 15627 1726882502.65111: filtering new block on tags 15627 1726882502.65128: done filtering new block on tags 15627 1726882502.65130: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 15627 1726882502.65135: extending task lists for all hosts with included blocks 15627 1726882502.65168: done extending task lists 15627 1726882502.65170: done processing included files 15627 1726882502.65170: results queue empty 15627 1726882502.65171: checking for any_errors_fatal 15627 1726882502.65173: done checking for any_errors_fatal 15627 1726882502.65174: checking for max_fail_percentage 15627 1726882502.65175: done checking for max_fail_percentage 15627 1726882502.65175: checking to see if all hosts have failed and the running 
result is not ok 15627 1726882502.65176: done checking to see if all hosts have failed 15627 1726882502.65177: getting the remaining hosts for this loop 15627 1726882502.65178: done getting the remaining hosts for this loop 15627 1726882502.65181: getting the next task for host managed_node1 15627 1726882502.65185: done getting next task for host managed_node1 15627 1726882502.65188: ^ task is: TASK: Check routes and DNS 15627 1726882502.65190: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15627 1726882502.65193: getting variables 15627 1726882502.65194: in VariableManager get_vars() 15627 1726882502.65202: Calling all_inventory to load vars for managed_node1 15627 1726882502.65204: Calling groups_inventory to load vars for managed_node1 15627 1726882502.65207: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882502.65212: Calling all_plugins_play to load vars for managed_node1 15627 1726882502.65215: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882502.65218: Calling groups_plugins_play to load vars for managed_node1 15627 1726882502.68073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882502.70140: done with get_vars() 15627 1726882502.70185: done getting variables 15627 1726882502.70250: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:35:02 -0400 (0:00:00.138) 0:00:42.455 ****** 15627 1726882502.70385: entering _queue_task() for managed_node1/shell 15627 1726882502.70909: worker is 1 (out of 1 available) 15627 1726882502.70923: exiting _queue_task() for managed_node1/shell 15627 1726882502.70936: done queuing things up, now waiting for results queue to drain 15627 1726882502.70937: waiting for pending results... 
15627 1726882502.71593: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 15627 1726882502.71839: in run() - task 0e448fcc-3ce9-2847-7723-00000000050b 15627 1726882502.71865: variable 'ansible_search_path' from source: unknown 15627 1726882502.71875: variable 'ansible_search_path' from source: unknown 15627 1726882502.72088: calling self._execute() 15627 1726882502.72318: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882502.72332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882502.72424: variable 'omit' from source: magic vars 15627 1726882502.72999: variable 'ansible_distribution_major_version' from source: facts 15627 1726882502.73085: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882502.73098: variable 'omit' from source: magic vars 15627 1726882502.73160: variable 'omit' from source: magic vars 15627 1726882502.73268: variable 'omit' from source: magic vars 15627 1726882502.73311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882502.73348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882502.73396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882502.73422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882502.73436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882502.73469: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882502.73477: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882502.73483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882502.74045: 
Set connection var ansible_timeout to 10 15627 1726882502.74233: Set connection var ansible_shell_executable to /bin/sh 15627 1726882502.74354: Set connection var ansible_connection to ssh 15627 1726882502.74368: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882502.74386: Set connection var ansible_pipelining to False 15627 1726882502.74410: Set connection var ansible_shell_type to sh 15627 1726882502.74503: variable 'ansible_shell_executable' from source: unknown 15627 1726882502.74511: variable 'ansible_connection' from source: unknown 15627 1726882502.74647: variable 'ansible_module_compression' from source: unknown 15627 1726882502.74668: variable 'ansible_shell_type' from source: unknown 15627 1726882502.74693: variable 'ansible_shell_executable' from source: unknown 15627 1726882502.74701: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882502.74739: variable 'ansible_pipelining' from source: unknown 15627 1726882502.74747: variable 'ansible_timeout' from source: unknown 15627 1726882502.74783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882502.75085: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882502.75103: variable 'omit' from source: magic vars 15627 1726882502.75115: starting attempt loop 15627 1726882502.75122: running the handler 15627 1726882502.75138: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882502.75163: 
_low_level_execute_command(): starting 15627 1726882502.75178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882502.76386: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.76460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.76477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.76494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.76539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.76553: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.76570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.76635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.76653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882502.76683: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.76696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.76740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.76757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.76772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.76783: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.76819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.76962: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 15627 1726882502.76980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.77008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.77188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.78837: stdout chunk (state=3): >>>/root <<< 15627 1726882502.78956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882502.79062: stderr chunk (state=3): >>><<< 15627 1726882502.79084: stdout chunk (state=3): >>><<< 15627 1726882502.79151: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882502.79156: _low_level_execute_command(): starting 15627 1726882502.79169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623 `" && echo ansible-tmp-1726882502.7911348-17416-253374358157623="` echo /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623 `" ) && sleep 0' 15627 1726882502.79991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.80006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.80030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.80049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.80094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.80107: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.80124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.80146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.80160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882502.80174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.80187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.80201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.80217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.80230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.80244: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.80310: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.80416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882502.80438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.80488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.80650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.82488: stdout chunk (state=3): >>>ansible-tmp-1726882502.7911348-17416-253374358157623=/root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623 <<< 15627 1726882502.82623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882502.82883: stderr chunk (state=3): >>><<< 15627 1726882502.82894: stdout chunk (state=3): >>><<< 15627 1726882502.82939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882502.7911348-17416-253374358157623=/root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882502.83035: variable 'ansible_module_compression' from source: unknown 15627 1726882502.83124: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15627 1726882502.83173: variable 'ansible_facts' from source: unknown 15627 1726882502.83239: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/AnsiballZ_command.py 15627 1726882502.83484: Sending initial data 15627 1726882502.83487: Sent initial data (156 bytes) 15627 1726882502.84994: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.85009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.85023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.85039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.85090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.85104: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.85122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.85142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.85156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882502.85173: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.85193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.85209: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.85231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.85244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.85255: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.85272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.85345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882502.85367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.85381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.85500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.87222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882502.87318: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882502.87410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmplvw02zz6 /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/AnsiballZ_command.py <<< 15627 1726882502.87513: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882502.88895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882502.89010: stderr chunk (state=3): >>><<< 15627 1726882502.89013: stdout chunk (state=3): >>><<< 15627 1726882502.89016: done transferring module to remote 15627 1726882502.89018: _low_level_execute_command(): starting 15627 1726882502.89020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/ /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/AnsiballZ_command.py && sleep 0' 15627 1726882502.89596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.89608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.89620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.89634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.89677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.89690: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.89702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.89716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.89725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882502.89734: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.89743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.89753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15627 1726882502.89769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.89781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.89793: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.89804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.89882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882502.89900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.89915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.90038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882502.91809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882502.91812: stdout chunk (state=3): >>><<< 15627 1726882502.91819: stderr chunk (state=3): >>><<< 15627 1726882502.91837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882502.91840: _low_level_execute_command(): starting 15627 1726882502.91845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/AnsiballZ_command.py && sleep 0' 15627 1726882502.92513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882502.92523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.92534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.92547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.92590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.92600: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882502.92610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.92622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882502.92630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882502.92636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882502.92643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882502.92652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882502.92672: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882502.92680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882502.92686: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882502.92695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882502.92774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882502.92791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882502.92802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882502.92931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.06880: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2968sec preferred_lft 2968sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho 
IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:35:03.058964", "end": "2024-09-20 21:35:03.067169", "delta": "0:00:00.008205", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882503.08085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 15627 1726882503.08090: stdout chunk (state=3): >>><<< 15627 1726882503.08093: stderr chunk (state=3): >>><<< 15627 1726882503.08277: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2968sec preferred_lft 2968sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref 
medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:35:03.058964", "end": "2024-09-20 21:35:03.067169", "delta": "0:00:00.008205", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882503.08281: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882503.08285: _low_level_execute_command(): starting 15627 1726882503.08287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882502.7911348-17416-253374358157623/ > /dev/null 2>&1 && sleep 0' 15627 1726882503.09003: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882503.09037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.09068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.09085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.09166: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.09187: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882503.09195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.09225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882503.09236: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882503.09238: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882503.09259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.09262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.09275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.09283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.09290: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882503.09300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.09373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882503.09409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882503.09449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.09547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.11321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.11415: stderr chunk (state=3): >>><<< 15627 1726882503.11426: stdout chunk (state=3): >>><<< 15627 1726882503.11470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882503.11477: handler run complete 15627 1726882503.11571: Evaluated conditional (False): False 15627 1726882503.11575: attempt loop complete, returning result 15627 1726882503.11577: _execute() done 15627 1726882503.11579: dumping result to json 15627 1726882503.11581: done dumping result, returning 15627 1726882503.11583: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0e448fcc-3ce9-2847-7723-00000000050b] 15627 1726882503.11585: sending task result for task 0e448fcc-3ce9-2847-7723-00000000050b 15627 1726882503.11748: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000050b 15627 1726882503.11752: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO 
/etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008205", "end": "2024-09-20 21:35:03.067169", "rc": 0, "start": "2024-09-20 21:35:03.058964" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2968sec preferred_lft 2968sec inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15627 1726882503.11833: no more pending results, returning what we have 15627 1726882503.11837: results queue empty 15627 1726882503.11839: checking for any_errors_fatal 15627 1726882503.11841: done checking for any_errors_fatal 15627 1726882503.11842: checking for max_fail_percentage 15627 1726882503.11844: done checking for max_fail_percentage 15627 1726882503.11845: checking to see if all hosts have failed and the running result is not ok 15627 1726882503.11846: done checking to see if all hosts have failed 15627 1726882503.11847: getting the remaining hosts for this loop 15627 1726882503.11848: done getting the remaining hosts for this loop 15627 1726882503.11852: getting the next task for host managed_node1 15627 1726882503.11861: done getting next task for host managed_node1 15627 1726882503.11866: ^ task is: 
TASK: Verify DNS and network connectivity 15627 1726882503.11869: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15627 1726882503.11874: getting variables 15627 1726882503.11876: in VariableManager get_vars() 15627 1726882503.11909: Calling all_inventory to load vars for managed_node1 15627 1726882503.11912: Calling groups_inventory to load vars for managed_node1 15627 1726882503.11916: Calling all_plugins_inventory to load vars for managed_node1 15627 1726882503.11930: Calling all_plugins_play to load vars for managed_node1 15627 1726882503.11933: Calling groups_plugins_inventory to load vars for managed_node1 15627 1726882503.11936: Calling groups_plugins_play to load vars for managed_node1 15627 1726882503.15097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15627 1726882503.18821: done with get_vars() 15627 1726882503.18855: done getting variables 15627 1726882503.19028: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 
Friday 20 September 2024 21:35:03 -0400 (0:00:00.486) 0:00:42.942 ****** 15627 1726882503.19058: entering _queue_task() for managed_node1/shell 15627 1726882503.19730: worker is 1 (out of 1 available) 15627 1726882503.19858: exiting _queue_task() for managed_node1/shell 15627 1726882503.19963: done queuing things up, now waiting for results queue to drain 15627 1726882503.19967: waiting for pending results... 15627 1726882503.20442: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 15627 1726882503.21189: in run() - task 0e448fcc-3ce9-2847-7723-00000000050c 15627 1726882503.21213: variable 'ansible_search_path' from source: unknown 15627 1726882503.21222: variable 'ansible_search_path' from source: unknown 15627 1726882503.21272: calling self._execute() 15627 1726882503.21375: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882503.21386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882503.21400: variable 'omit' from source: magic vars 15627 1726882503.21803: variable 'ansible_distribution_major_version' from source: facts 15627 1726882503.22584: Evaluated conditional (ansible_distribution_major_version != '6'): True 15627 1726882503.22736: variable 'ansible_facts' from source: unknown 15627 1726882503.24248: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15627 1726882503.24266: variable 'omit' from source: magic vars 15627 1726882503.24315: variable 'omit' from source: magic vars 15627 1726882503.24351: variable 'omit' from source: magic vars 15627 1726882503.24400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15627 1726882503.24442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15627 1726882503.24473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15627 1726882503.24495: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882503.24512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15627 1726882503.24550: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15627 1726882503.24636: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882503.24644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882503.24756: Set connection var ansible_timeout to 10 15627 1726882503.24776: Set connection var ansible_shell_executable to /bin/sh 15627 1726882503.24787: Set connection var ansible_connection to ssh 15627 1726882503.24796: Set connection var ansible_module_compression to ZIP_DEFLATED 15627 1726882503.24804: Set connection var ansible_pipelining to False 15627 1726882503.24810: Set connection var ansible_shell_type to sh 15627 1726882503.24840: variable 'ansible_shell_executable' from source: unknown 15627 1726882503.24849: variable 'ansible_connection' from source: unknown 15627 1726882503.24860: variable 'ansible_module_compression' from source: unknown 15627 1726882503.24869: variable 'ansible_shell_type' from source: unknown 15627 1726882503.24879: variable 'ansible_shell_executable' from source: unknown 15627 1726882503.24886: variable 'ansible_host' from source: host vars for 'managed_node1' 15627 1726882503.24893: variable 'ansible_pipelining' from source: unknown 15627 1726882503.24900: variable 'ansible_timeout' from source: unknown 15627 1726882503.24908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15627 1726882503.25062: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882503.25084: variable 'omit' from source: magic vars 15627 1726882503.25099: starting attempt loop 15627 1726882503.25106: running the handler 15627 1726882503.25127: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 15627 1726882503.25156: _low_level_execute_command(): starting 15627 1726882503.25174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15627 1726882503.26648: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882503.26669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.26688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.26708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.26757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.26773: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882503.26788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.26808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882503.26821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882503.26838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882503.26854: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 15627 1726882503.26867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.26888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.26895: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882503.26905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.26982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882503.27001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882503.27013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.27140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.28783: stdout chunk (state=3): >>>/root <<< 15627 1726882503.28945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.28950: stdout chunk (state=3): >>><<< 15627 1726882503.28962: stderr chunk (state=3): >>><<< 15627 1726882503.29077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882503.29088: _low_level_execute_command(): starting 15627 1726882503.29091: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468 `" && echo ansible-tmp-1726882503.2898364-17437-126874903621468="` echo /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468 `" ) && sleep 0' 15627 1726882503.31235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.31285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.31326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882503.31329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.31331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.31508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882503.31520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.31634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.33496: stdout chunk (state=3): >>>ansible-tmp-1726882503.2898364-17437-126874903621468=/root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468 <<< 15627 1726882503.33601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.33687: stderr chunk (state=3): >>><<< 15627 1726882503.33691: stdout chunk (state=3): >>><<< 15627 1726882503.33991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882503.2898364-17437-126874903621468=/root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882503.33999: variable 'ansible_module_compression' from source: unknown 15627 1726882503.34002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15627yb6z139m/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15627 1726882503.34004: variable 'ansible_facts' from source: unknown 15627 1726882503.34005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/AnsiballZ_command.py 15627 1726882503.34382: Sending initial data 15627 1726882503.34385: Sent initial data (156 bytes) 15627 1726882503.37009: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882503.37024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.37037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.37058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.37110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.37122: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882503.37134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.37150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882503.37168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882503.37180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882503.37191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.37203: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.37217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.37793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.37806: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882503.37819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.37903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882503.37928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882503.37944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.38679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.39824: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15627 1726882503.39912: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 15627 1726882503.40006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15627yb6z139m/tmp5nvv6vvh /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/AnsiballZ_command.py <<< 15627 1726882503.40095: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 15627 1726882503.41571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.41575: stderr chunk (state=3): >>><<< 15627 1726882503.41580: stdout chunk (state=3): >>><<< 15627 1726882503.41697: done transferring module to remote 15627 1726882503.41701: _low_level_execute_command(): starting 15627 1726882503.41704: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/ /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/AnsiballZ_command.py && sleep 0' 15627 1726882503.43195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.43199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.43351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 15627 1726882503.43357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.43359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 15627 1726882503.43362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.43415: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 15627 1726882503.43448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882503.43451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.43561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.45281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.45349: stderr chunk (state=3): >>><<< 15627 1726882503.45352: stdout chunk (state=3): >>><<< 15627 1726882503.45445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15627 1726882503.45449: _low_level_execute_command(): starting 15627 1726882503.45451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/AnsiballZ_command.py 
&& sleep 0' 15627 1726882503.46463: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15627 1726882503.46993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.47009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.47029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.47078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.47124: stderr chunk (state=3): >>>debug2: match not found <<< 15627 1726882503.47139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.47212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15627 1726882503.47225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 15627 1726882503.47238: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15627 1726882503.47248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.47262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.47278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.47288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 15627 1726882503.47297: stderr chunk (state=3): >>>debug2: match found <<< 15627 1726882503.47307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.47376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 15627 1726882503.47645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15627 
1726882503.47665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.47926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.89139: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 12200 0 --:--:-- --:--:-- --:--:-- 12200\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1248 0 --:--:-- --:--:-- --:--:-- 1248", "rc": 0, "cmd": "set -euo pipefail\necho 
CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:35:03.608004", "end": "2024-09-20 21:35:03.889768", "delta": "0:00:00.281764", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15627 1726882503.90488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 15627 1726882503.90526: stderr chunk (state=3): >>><<< 15627 1726882503.90529: stdout chunk (state=3): >>><<< 15627 1726882503.90670: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 12200 0 --:--:-- --:--:-- --:--:-- 12200\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1248 0 --:--:-- --:--:-- --:--:-- 1248", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:35:03.608004", "end": "2024-09-20 21:35:03.889768", "delta": "0:00:00.281764", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 15627 1726882503.90674: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15627 1726882503.90676: _low_level_execute_command(): starting 15627 1726882503.90679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882503.2898364-17437-126874903621468/ > /dev/null 2>&1 && sleep 0' 15627 1726882503.91945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.91951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15627 1726882503.91992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.91998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15627 1726882503.92011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15627 1726882503.92017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15627 1726882503.92103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 15627 1726882503.92121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15627 1726882503.92236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15627 1726882503.94484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15627 1726882503.94487: stdout chunk (state=3): >>><<< 15627 1726882503.94489: stderr chunk (state=3): >>><<< 15627 1726882503.94491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15627 1726882503.94495: handler run complete
15627 1726882503.94497: Evaluated conditional (False): False
15627 1726882503.94499: attempt loop complete, returning result
15627 1726882503.94500: _execute() done
15627 1726882503.94502: dumping result to json
15627 1726882503.94504: done dumping result, returning
15627 1726882503.94505: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-2847-7723-00000000050c]
15627 1726882503.94507: sending task result for task 0e448fcc-3ce9-2847-7723-00000000050c
15627 1726882503.94581: done sending task result for task 0e448fcc-3ce9-2847-7723-00000000050c
15627 1726882503.94583: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.281764",
    "end": "2024-09-20 21:35:03.889768",
    "rc": 0,
    "start": "2024-09-20 21:35:03.608004"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0  12200      0 --:--:-- --:--:-- --:--:-- 12200
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   1248      0 --:--:-- --:--:-- --:--:--  1248

15627 1726882503.94818: no more pending results, returning what we have
15627 1726882503.94821: results queue empty
15627 1726882503.94822:
checking for any_errors_fatal
15627 1726882503.94833: done checking for any_errors_fatal
15627 1726882503.94834: checking for max_fail_percentage
15627 1726882503.94836: done checking for max_fail_percentage
15627 1726882503.94836: checking to see if all hosts have failed and the running result is not ok
15627 1726882503.94837: done checking to see if all hosts have failed
15627 1726882503.94838: getting the remaining hosts for this loop
15627 1726882503.94840: done getting the remaining hosts for this loop
15627 1726882503.94843: getting the next task for host managed_node1
15627 1726882503.94851: done getting next task for host managed_node1
15627 1726882503.94853: ^ task is: TASK: meta (flush_handlers)
15627 1726882503.94855: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882503.94858: getting variables
15627 1726882503.94860: in VariableManager get_vars()
15627 1726882503.94892: Calling all_inventory to load vars for managed_node1
15627 1726882503.94895: Calling groups_inventory to load vars for managed_node1
15627 1726882503.94899: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882503.94909: Calling all_plugins_play to load vars for managed_node1
15627 1726882503.94911: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882503.94914: Calling groups_plugins_play to load vars for managed_node1
15627 1726882503.96779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882503.99267: done with get_vars()
15627 1726882503.99290: done getting variables
15627 1726882503.99359: in VariableManager get_vars()
15627 1726882503.99370: Calling all_inventory to load vars for managed_node1
15627 1726882503.99373: Calling groups_inventory to load vars for managed_node1
15627 1726882503.99375: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882503.99380: Calling all_plugins_play to load vars for managed_node1
15627 1726882503.99383: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882503.99385: Calling groups_plugins_play to load vars for managed_node1
15627 1726882504.00651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882504.03004: done with get_vars()
15627 1726882504.03033: done queuing things up, now waiting for results queue to drain
15627 1726882504.03035: results queue empty
15627 1726882504.03036: checking for any_errors_fatal
15627 1726882504.03040: done checking for any_errors_fatal
15627 1726882504.03041: checking for max_fail_percentage
15627 1726882504.03042: done checking for max_fail_percentage
15627 1726882504.03043: checking to see if all hosts have failed and the running result is not ok
15627 1726882504.03043: done checking to see if all hosts have failed
15627 1726882504.03044: getting the remaining hosts for this loop
15627 1726882504.03045: done getting the remaining hosts for this loop
15627 1726882504.03048: getting the next task for host managed_node1
15627 1726882504.03052: done getting next task for host managed_node1
15627 1726882504.03054: ^ task is: TASK: meta (flush_handlers)
15627 1726882504.03055: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882504.03058: getting variables
15627 1726882504.03059: in VariableManager get_vars()
15627 1726882504.03071: Calling all_inventory to load vars for managed_node1
15627 1726882504.03074: Calling groups_inventory to load vars for managed_node1
15627 1726882504.03076: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882504.03082: Calling all_plugins_play to load vars for managed_node1
15627 1726882504.03085: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882504.03087: Calling groups_plugins_play to load vars for managed_node1
15627 1726882504.04374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882504.07799: done with get_vars()
15627 1726882504.07984: done getting variables
15627 1726882504.08543: in VariableManager get_vars()
15627 1726882504.08553: Calling all_inventory to load vars for managed_node1
15627 1726882504.08555: Calling groups_inventory to load vars for managed_node1
15627 1726882504.08557: Calling all_plugins_inventory to load vars for managed_node1
15627 1726882504.08563: Calling all_plugins_play to load vars for managed_node1
15627 1726882504.08590: Calling groups_plugins_inventory to load vars for managed_node1
15627 1726882504.08594: Calling groups_plugins_play to load vars for managed_node1
15627 1726882504.10891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15627 1726882504.13979: done with get_vars()
15627 1726882504.14005: done queuing things up, now waiting for results queue to drain
15627 1726882504.14007: results queue empty
15627 1726882504.14007: checking for any_errors_fatal
15627 1726882504.14009: done checking for any_errors_fatal
15627 1726882504.14009: checking for max_fail_percentage
15627 1726882504.14010: done checking for max_fail_percentage
15627 1726882504.14011: checking to see if all hosts have failed and the running result is not ok
15627 1726882504.14012: done checking to see if all hosts have failed
15627 1726882504.14012: getting the remaining hosts for this loop
15627 1726882504.14013: done getting the remaining hosts for this loop
15627 1726882504.14016: getting the next task for host managed_node1
15627 1726882504.14019: done getting next task for host managed_node1
15627 1726882504.14020: ^ task is: None
15627 1726882504.14021: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15627 1726882504.14022: done queuing things up, now waiting for results queue to drain
15627 1726882504.14023: results queue empty
15627 1726882504.14024: checking for any_errors_fatal
15627 1726882504.14024: done checking for any_errors_fatal
15627 1726882504.14025: checking for max_fail_percentage
15627 1726882504.14026: done checking for max_fail_percentage
15627 1726882504.14026: checking to see if all hosts have failed and the running result is not ok
15627 1726882504.14027: done checking to see if all hosts have failed
15627 1726882504.14028: getting the next task for host managed_node1
15627 1726882504.14030: done getting next task for host managed_node1
15627 1726882504.14031: ^ task is: None
15627 1726882504.14032: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node1              : ok=82   changed=3    unreachable=0    failed=0    skipped=71   rescued=0    ignored=2

Friday 20 September 2024  21:35:04 -0400 (0:00:00.950)       0:00:43.892 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.75s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.69s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.44s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.20s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.13s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.96s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 0.95s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.93s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.86s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.86s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 0.85s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
15627 1726882504.14130: RUNNING CLEANUP
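Editor's note: the escaped `cmd` string in the "Verify DNS and network connectivity" task result above unescapes to a short shell check. A sketch of that logic follows, rewrapped as a reusable function for readability; the function name, argument handling, and indentation are additions, while the `getent`/`curl` checks themselves are taken from the log:

```shell
#!/usr/bin/env bash
set -euo pipefail

# DNS/connectivity check from the "Verify DNS and network connectivity" task.
# Each host must resolve through NSS (getent hosts) and answer an HTTPS request.
check_connectivity() {
    echo CHECK DNS AND CONNECTIVITY
    for host in "$@"; do
        # Name resolution: getent consults /etc/nsswitch.conf (DNS on these nodes)
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            return 1
        fi
        # Reachability: a failed HTTPS request fails the whole check
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            return 1
        fi
    done
}

# Network-dependent; the task in the log checks these two mirrors:
#   check_connectivity mirrors.fedoraproject.org mirrors.centos.org
```

The task's `rc: 0` and the STDOUT above show both lookups succeeding (the mirrors resolve to `wildcard.fedoraproject.org` addresses) and both HTTPS probes completing, which is why the play recap reports the task at 0.95s with no failures.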