[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 28173 1726882746.30927: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Xyq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 28173 1726882746.31250: Added group all to inventory 28173 1726882746.31252: Added group ungrouped to inventory 28173 1726882746.31255: Group all now contains ungrouped 28173 1726882746.31257: Examining possible inventory source: /tmp/network-91m/inventory.yml 28173 1726882746.46167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 28173 1726882746.46233: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 28173 1726882746.46260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 28173 1726882746.46323: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 28173 1726882746.46409: Loaded config def from plugin (inventory/script) 28173 1726882746.46411: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 28173 1726882746.46461: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 28173 1726882746.46551: Loaded config def from plugin (inventory/yaml) 28173 1726882746.46557: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 28173 1726882746.46648: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 28173 1726882746.47122: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 28173 1726882746.47126: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 28173 1726882746.47129: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 28173 1726882746.47134: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 28173 1726882746.47139: Loading data from /tmp/network-91m/inventory.yml 28173 1726882746.47214: /tmp/network-91m/inventory.yml was not parsable by auto 28173 1726882746.47285: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 28173 1726882746.47334: Loading data from /tmp/network-91m/inventory.yml 28173 1726882746.47418: group all already in inventory 28173 1726882746.47425: set inventory_file for managed_node1 28173 1726882746.47434: set inventory_dir for managed_node1 28173 1726882746.47435: Added host managed_node1 to inventory 28173 1726882746.47442: Added host managed_node1 to group all 28173 1726882746.47443: set ansible_host for managed_node1 28173 1726882746.47444: 
set ansible_ssh_extra_args for managed_node1 28173 1726882746.47448: set inventory_file for managed_node2 28173 1726882746.47452: set inventory_dir for managed_node2 28173 1726882746.47452: Added host managed_node2 to inventory 28173 1726882746.47454: Added host managed_node2 to group all 28173 1726882746.47455: set ansible_host for managed_node2 28173 1726882746.47456: set ansible_ssh_extra_args for managed_node2 28173 1726882746.47458: set inventory_file for managed_node3 28173 1726882746.47461: set inventory_dir for managed_node3 28173 1726882746.47462: Added host managed_node3 to inventory 28173 1726882746.47463: Added host managed_node3 to group all 28173 1726882746.47466: set ansible_host for managed_node3 28173 1726882746.47467: set ansible_ssh_extra_args for managed_node3 28173 1726882746.47470: Reconcile groups and hosts in inventory. 28173 1726882746.47474: Group ungrouped now contains managed_node1 28173 1726882746.47477: Group ungrouped now contains managed_node2 28173 1726882746.47478: Group ungrouped now contains managed_node3 28173 1726882746.47570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 28173 1726882746.47703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 28173 1726882746.47751: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 28173 1726882746.47789: Loaded config def from plugin (vars/host_group_vars) 28173 1726882746.47791: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 28173 1726882746.47798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 28173 1726882746.47806: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 28173 1726882746.47848: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 28173 1726882746.48206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882746.48308: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 28173 1726882746.48351: Loaded config def from plugin (connection/local) 28173 1726882746.48355: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 28173 1726882746.49005: Loaded config def from plugin (connection/paramiko_ssh) 28173 1726882746.49008: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 28173 1726882746.50013: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28173 1726882746.50061: Loaded config def from plugin (connection/psrp) 28173 1726882746.50066: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 28173 1726882746.50823: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28173 1726882746.50863: Loaded config def from plugin (connection/ssh) 28173 1726882746.50868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 28173 1726882746.52858: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28173 1726882746.52906: Loaded config def from plugin (connection/winrm) 28173 1726882746.52909: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 28173 1726882746.52939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 28173 1726882746.53011: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 28173 1726882746.53088: Loaded config def from plugin (shell/cmd) 28173 1726882746.53093: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 28173 1726882746.53118: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 28173 1726882746.53191: Loaded config def from plugin (shell/powershell) 28173 1726882746.53193: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 28173 1726882746.53248: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 28173 1726882746.53448: Loaded config def from plugin (shell/sh) 28173 1726882746.53451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 28173 1726882746.53489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 28173 1726882746.53768: Loaded config def from plugin (become/runas) 28173 1726882746.53770: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 28173 1726882746.53977: Loaded config def from plugin (become/su) 28173 1726882746.53980: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 28173 1726882746.54151: Loaded config def from plugin (become/sudo) 28173 1726882746.54153: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 28173 1726882746.54194: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml 28173 1726882746.54539: in VariableManager get_vars() 28173 1726882746.54559: done with get_vars() 28173 1726882746.54696: trying /usr/local/lib/python3.12/site-packages/ansible/modules 28173 1726882746.58793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 28173 1726882746.58917: in VariableManager get_vars() 
28173 1726882746.58922: done with get_vars() 28173 1726882746.58925: variable 'playbook_dir' from source: magic vars 28173 1726882746.58926: variable 'ansible_playbook_python' from source: magic vars 28173 1726882746.58927: variable 'ansible_config_file' from source: magic vars 28173 1726882746.58927: variable 'groups' from source: magic vars 28173 1726882746.58928: variable 'omit' from source: magic vars 28173 1726882746.58929: variable 'ansible_version' from source: magic vars 28173 1726882746.58930: variable 'ansible_check_mode' from source: magic vars 28173 1726882746.58930: variable 'ansible_diff_mode' from source: magic vars 28173 1726882746.58931: variable 'ansible_forks' from source: magic vars 28173 1726882746.58932: variable 'ansible_inventory_sources' from source: magic vars 28173 1726882746.58933: variable 'ansible_skip_tags' from source: magic vars 28173 1726882746.58933: variable 'ansible_limit' from source: magic vars 28173 1726882746.58934: variable 'ansible_run_tags' from source: magic vars 28173 1726882746.58935: variable 'ansible_verbosity' from source: magic vars 28173 1726882746.58982: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml 28173 1726882746.59922: in VariableManager get_vars() 28173 1726882746.59939: done with get_vars() 28173 1726882746.60013: in VariableManager get_vars() 28173 1726882746.60809: done with get_vars() 28173 1726882746.60846: in VariableManager get_vars() 28173 1726882746.60858: done with get_vars() 28173 1726882746.60909: in VariableManager get_vars() 28173 1726882746.61038: done with get_vars() 28173 1726882746.61043: variable 'omit' from source: magic vars 28173 1726882746.61062: variable 'omit' from source: magic vars 28173 1726882746.61100: in VariableManager get_vars() 28173 1726882746.61111: done with get_vars() 28173 1726882746.61276: in VariableManager get_vars() 28173 1726882746.61289: done with get_vars() 28173 1726882746.61325: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28173 1726882746.61896: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28173 1726882746.62150: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28173 1726882746.63648: in VariableManager get_vars() 28173 1726882746.63675: done with get_vars() 28173 1726882746.64591: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 28173 1726882746.64968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882746.68745: in VariableManager get_vars() 28173 1726882746.68793: done with get_vars() 28173 1726882746.68799: variable 'omit' from source: magic vars 28173 1726882746.68811: variable 'omit' from source: magic vars 28173 1726882746.68853: in VariableManager get_vars() 28173 1726882746.68873: done with get_vars() 28173 1726882746.68902: in VariableManager get_vars() 28173 1726882746.68936: done with get_vars() 28173 1726882746.68972: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28173 1726882746.69105: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28173 1726882746.69190: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28173 1726882746.69637: in VariableManager get_vars() 28173 1726882746.69669: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882746.72082: in VariableManager get_vars() 28173 1726882746.72085: done with get_vars() 28173 1726882746.72088: variable 'playbook_dir' from source: magic vars 28173 1726882746.72089: variable 'ansible_playbook_python' from source: magic vars 28173 1726882746.72090: variable 'ansible_config_file' from source: magic vars 28173 1726882746.72091: variable 'groups' from source: magic vars 28173 1726882746.72091: variable 'omit' from source: magic vars 28173 1726882746.72092: variable 'ansible_version' from source: magic vars 28173 1726882746.72093: variable 'ansible_check_mode' from source: magic vars 28173 1726882746.72094: variable 'ansible_diff_mode' from source: magic vars 28173 1726882746.72095: variable 'ansible_forks' from source: magic vars 28173 1726882746.72095: variable 'ansible_inventory_sources' from source: magic vars 28173 1726882746.72096: variable 'ansible_skip_tags' from source: magic vars 28173 1726882746.72097: variable 'ansible_limit' from source: magic vars 28173 1726882746.72098: variable 'ansible_run_tags' from source: magic vars 28173 1726882746.72098: variable 'ansible_verbosity' from source: magic vars 28173 1726882746.72133: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 28173 1726882746.72218: in VariableManager get_vars() 28173 1726882746.72222: done with get_vars() 28173 1726882746.72224: variable 'playbook_dir' from source: magic vars 28173 1726882746.72225: variable 'ansible_playbook_python' from source: magic vars 28173 1726882746.72226: variable 'ansible_config_file' from source: magic vars 28173 1726882746.72227: variable 'groups' from source: magic vars 28173 1726882746.72227: variable 'omit' from source: magic vars 28173 1726882746.72228: variable 'ansible_version' from source: magic vars 28173 1726882746.72229: variable 'ansible_check_mode' from source: magic vars 28173 1726882746.72230: variable 'ansible_diff_mode' from source: magic vars 28173 1726882746.72231: variable 'ansible_forks' from source: magic vars 28173 1726882746.72231: variable 'ansible_inventory_sources' from source: magic vars 28173 1726882746.72237: variable 'ansible_skip_tags' from source: magic vars 28173 1726882746.72238: variable 'ansible_limit' from source: magic vars 28173 1726882746.72239: variable 'ansible_run_tags' from source: magic vars 28173 1726882746.72239: variable 'ansible_verbosity' from source: magic vars 28173 1726882746.72280: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 28173 1726882746.72444: in VariableManager get_vars() 28173 1726882746.72455: done with get_vars() 28173 1726882746.72506: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28173 1726882746.72626: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28173 1726882746.72715: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28173 1726882746.74805: in VariableManager get_vars() 28173 1726882746.74825: done with get_vars() redirecting (type: action) ansible.builtin.yum to 
ansible.builtin.dnf 28173 1726882746.78452: in VariableManager get_vars() 28173 1726882746.78471: done with get_vars() 28173 1726882746.78509: in VariableManager get_vars() 28173 1726882746.78513: done with get_vars() 28173 1726882746.78515: variable 'playbook_dir' from source: magic vars 28173 1726882746.78516: variable 'ansible_playbook_python' from source: magic vars 28173 1726882746.78516: variable 'ansible_config_file' from source: magic vars 28173 1726882746.78517: variable 'groups' from source: magic vars 28173 1726882746.78518: variable 'omit' from source: magic vars 28173 1726882746.78519: variable 'ansible_version' from source: magic vars 28173 1726882746.78520: variable 'ansible_check_mode' from source: magic vars 28173 1726882746.78520: variable 'ansible_diff_mode' from source: magic vars 28173 1726882746.78521: variable 'ansible_forks' from source: magic vars 28173 1726882746.78522: variable 'ansible_inventory_sources' from source: magic vars 28173 1726882746.78522: variable 'ansible_skip_tags' from source: magic vars 28173 1726882746.78523: variable 'ansible_limit' from source: magic vars 28173 1726882746.78524: variable 'ansible_run_tags' from source: magic vars 28173 1726882746.78524: variable 'ansible_verbosity' from source: magic vars 28173 1726882746.78568: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 28173 1726882746.78638: in VariableManager get_vars() 28173 1726882746.78649: done with get_vars() 28173 1726882746.78691: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28173 1726882746.78810: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28173 1726882746.78884: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28173 1726882746.79234: in VariableManager get_vars() 28173 1726882746.79252: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882746.81099: in VariableManager get_vars() 28173 1726882746.81113: done with get_vars() 28173 1726882746.81149: in VariableManager get_vars() 28173 1726882746.81161: done with get_vars() 28173 1726882746.81199: in VariableManager get_vars() 28173 1726882746.81210: done with get_vars() 28173 1726882746.81277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 28173 1726882746.81291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 28173 1726882746.81535: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 28173 1726882746.81694: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 28173 1726882746.81697: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 28173 1726882746.81726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 28173 1726882746.81751: 
Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 28173 1726882746.81923: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 28173 1726882746.82005: Loaded config def from plugin (callback/default) 28173 1726882746.82007: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28173 1726882746.83296: Loaded config def from plugin (callback/junit) 28173 1726882746.83299: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28173 1726882746.83342: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 28173 1726882746.83411: Loaded config def from plugin (callback/minimal) 28173 1726882746.83413: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28173 1726882746.83452: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28173 1726882746.83515: Loaded config def from plugin (callback/tree) 28173 1726882746.83518: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 28173 1726882746.83635: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 28173 1726882746.83638: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_route_table_nm.yml ********************************************* 6 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml 28173 1726882746.83670: in VariableManager get_vars() 28173 1726882746.83681: done with get_vars() 28173 1726882746.83687: in VariableManager get_vars() 28173 1726882746.83695: done with get_vars() 28173 1726882746.83699: variable 'omit' from source: magic vars 28173 1726882746.83734: in VariableManager get_vars() 28173 1726882746.83746: done with get_vars() 28173 1726882746.83769: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_route_table.yml' with nm as provider] ****** 28173 1726882746.84307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 28173 1726882746.84418: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 28173 1726882746.84446: getting the remaining hosts for this loop 28173 1726882746.84447: done getting the remaining hosts for this loop 28173 1726882746.84450: getting the next task for host managed_node2 28173 1726882746.84453: done getting next task for host managed_node2 28173 1726882746.84455: ^ task is: TASK: Gathering Facts 28173 1726882746.84456: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882746.84458: getting variables 28173 1726882746.84460: in VariableManager get_vars() 28173 1726882746.84470: Calling all_inventory to load vars for managed_node2 28173 1726882746.84472: Calling groups_inventory to load vars for managed_node2 28173 1726882746.84475: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882746.84486: Calling all_plugins_play to load vars for managed_node2 28173 1726882746.84497: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882746.84501: Calling groups_plugins_play to load vars for managed_node2 28173 1726882746.84533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882746.84585: done with get_vars() 28173 1726882746.84591: done getting variables 28173 1726882746.84670: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6 Friday 20 September 2024 21:39:06 -0400 (0:00:00.011) 0:00:00.011 ****** 28173 1726882746.84690: entering _queue_task() for managed_node2/gather_facts 28173 1726882746.84691: Creating lock for gather_facts 28173 1726882746.85047: worker is 1 (out of 1 available) 28173 1726882746.85057: exiting _queue_task() for managed_node2/gather_facts 28173 1726882746.85073: done queuing things up, now waiting for results queue to drain 28173 1726882746.85075: waiting for pending results... 
28173 1726882746.85316: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882746.85414: in run() - task 0e448fcc-3ce9-926c-8928-0000000000f5 28173 1726882746.85445: variable 'ansible_search_path' from source: unknown 28173 1726882746.85486: calling self._execute() 28173 1726882746.85555: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882746.85566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882746.85581: variable 'omit' from source: magic vars 28173 1726882746.85705: variable 'omit' from source: magic vars 28173 1726882746.85739: variable 'omit' from source: magic vars 28173 1726882746.85791: variable 'omit' from source: magic vars 28173 1726882746.85835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882746.85884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882746.85905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882746.85927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882746.85942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882746.85978: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882746.85993: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882746.86000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882746.86106: Set connection var ansible_pipelining to False 28173 1726882746.86114: Set connection var ansible_shell_type to sh 28173 1726882746.86125: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882746.86135: Set connection var ansible_timeout to 10 28173 1726882746.86143: Set connection var ansible_shell_executable to /bin/sh 28173 1726882746.86151: Set connection var ansible_connection to ssh 28173 1726882746.86179: variable 'ansible_shell_executable' from source: unknown 28173 1726882746.86186: variable 'ansible_connection' from source: unknown 28173 1726882746.86192: variable 'ansible_module_compression' from source: unknown 28173 1726882746.86208: variable 'ansible_shell_type' from source: unknown 28173 1726882746.86215: variable 'ansible_shell_executable' from source: unknown 28173 1726882746.86222: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882746.86229: variable 'ansible_pipelining' from source: unknown 28173 1726882746.86235: variable 'ansible_timeout' from source: unknown 28173 1726882746.86242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882746.86429: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882746.86445: variable 'omit' from source: magic vars 28173 1726882746.86454: starting attempt loop 28173 1726882746.86460: running the handler 28173 1726882746.86483: variable 'ansible_facts' from source: unknown 28173 1726882746.86515: _low_level_execute_command(): starting 28173 1726882746.86537: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882746.87923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882746.87936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882746.87959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882746.87981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882746.88019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882746.88031: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882746.88044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882746.88072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882746.88086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882746.88097: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882746.88108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882746.88121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882746.88134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882746.88145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882746.88158: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882746.88179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882746.88255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882746.88280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882746.88297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882746.88431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882746.90101: stdout chunk (state=3): >>>/root <<< 28173 1726882746.90288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882746.90291: stdout chunk (state=3): >>><<< 28173 1726882746.90294: stderr chunk (state=3): >>><<< 28173 1726882746.90400: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882746.90403: _low_level_execute_command(): starting 28173 1726882746.90406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875 `" && echo ansible-tmp-1726882746.9031072-28195-41565633484875="` echo /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875 `" ) && sleep 0' 28173 1726882746.91436: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882746.91445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882746.91481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882746.91485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882746.91487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882746.91647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882746.91666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882746.91700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882746.91845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882746.93728: stdout chunk (state=3): >>>ansible-tmp-1726882746.9031072-28195-41565633484875=/root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875 <<< 28173 1726882746.93912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882746.93915: stdout chunk (state=3): >>><<< 28173 1726882746.93917: stderr chunk (state=3): >>><<< 28173 1726882746.94169: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882746.9031072-28195-41565633484875=/root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882746.94173: variable 'ansible_module_compression' from source: unknown 28173 1726882746.94175: ANSIBALLZ: Using generic lock for ansible.legacy.setup 28173 1726882746.94177: ANSIBALLZ: Acquiring lock 28173 1726882746.94179: ANSIBALLZ: Lock acquired: 140243978110592 28173 1726882746.94181: ANSIBALLZ: Creating module 28173 1726882747.26912: ANSIBALLZ: Writing module into payload 28173 1726882747.27086: ANSIBALLZ: Writing module 28173 1726882747.27134: ANSIBALLZ: Renaming module 28173 1726882747.27145: ANSIBALLZ: Done creating module 28173 1726882747.27191: variable 'ansible_facts' from source: unknown 28173 1726882747.27203: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882747.27217: _low_level_execute_command(): starting 28173 1726882747.27237: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 28173 1726882747.28298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882747.28313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.28329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.28348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.28401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.28413: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882747.28427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.28447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882747.28464: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882747.28485: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882747.28498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.28511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.28525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.28537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.28547: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882747.28561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.28647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882747.28673: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28173 1726882747.28699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882747.28842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882747.30518: stdout chunk (state=3): >>>PLATFORM <<< 28173 1726882747.30617: stdout chunk (state=3): >>>Linux <<< 28173 1726882747.30632: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 28173 1726882747.30775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882747.30852: stderr chunk (state=3): >>><<< 28173 1726882747.30872: stdout chunk (state=3): >>><<< 28173 1726882747.31024: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882747.31035 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 28173 1726882747.31038: _low_level_execute_command(): starting 28173 1726882747.31040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 28173 1726882747.31102: Sending initial data 28173 1726882747.31105: Sent initial data (1181 bytes) 28173 1726882747.31840: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882747.31872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.31907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.32727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.32768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.32780: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882747.32792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.32807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882747.32816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882747.32832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882747.32842: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 28173 1726882747.32852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.32868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.32878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.32887: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882747.32897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.33092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882747.33112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882747.33126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882747.33253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882747.37855: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 28173 1726882747.38357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882747.38577: stderr chunk (state=3): >>><<< 28173 1726882747.38580: stdout chunk (state=3): >>><<< 28173 1726882747.38582: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882747.38585: variable 'ansible_facts' from source: unknown 28173 
1726882747.38586: variable 'ansible_facts' from source: unknown 28173 1726882747.38588: variable 'ansible_module_compression' from source: unknown 28173 1726882747.38590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882747.38772: variable 'ansible_facts' from source: unknown 28173 1726882747.38916: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/AnsiballZ_setup.py 28173 1726882747.39131: Sending initial data 28173 1726882747.39134: Sent initial data (153 bytes) 28173 1726882747.40082: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.40085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.40117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.40121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.40123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.40181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882747.40192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882747.40301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28173 1726882747.42366: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882747.42459: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882747.42569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpl4383dr9 /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/AnsiballZ_setup.py <<< 28173 1726882747.42671: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882747.45466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882747.45653: stderr chunk (state=3): >>><<< 28173 1726882747.45668: stdout chunk (state=3): >>><<< 28173 1726882747.45758: done transferring module to remote 
28173 1726882747.45761: _low_level_execute_command(): starting 28173 1726882747.45766: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/ /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/AnsiballZ_setup.py && sleep 0' 28173 1726882747.47318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882747.47333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.47347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.47366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.47419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.47431: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882747.47445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.47462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882747.47478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882747.47490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882747.47514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.47529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.47545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.47558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.47574: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882747.47587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.47677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882747.47698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882747.47718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882747.47861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882747.49781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882747.49784: stdout chunk (state=3): >>><<< 28173 1726882747.49786: stderr chunk (state=3): >>><<< 28173 1726882747.49878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882747.49882: _low_level_execute_command(): starting 28173 1726882747.49884: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/AnsiballZ_setup.py && sleep 0' 28173 1726882747.50548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882747.50562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.50580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.50597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.50648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.50659: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882747.50674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.50692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882747.50704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882747.50716: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882747.50739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882747.50753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882747.50773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882747.50786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882747.50798: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882747.50812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882747.50905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882747.50926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882747.50953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882747.51099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882747.53118: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 28173 1726882747.53121: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28173 1726882747.53196: stdout chunk (state=3): >>>import '_io' # <<< 28173 1726882747.53199: stdout chunk (state=3): >>>import 'marshal' # <<< 28173 1726882747.53228: stdout chunk (state=3): >>>import 'posix' # <<< 28173 1726882747.53269: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 28173 1726882747.53294: stdout chunk (state=3): >>># installing zipimport hook <<< 28173 1726882747.53308: stdout chunk (state=3): >>>import 'time' # import 
'zipimport' # # installed zipimport hook <<< 28173 1726882747.53363: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.53411: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 28173 1726882747.53424: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51edc0> <<< 28173 1726882747.53471: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 28173 1726882747.53501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51eb20> <<< 28173 1726882747.53553: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51eac0> <<< 28173 1726882747.53568: stdout chunk (state=3): >>>import '_signal' # <<< 28173 1726882747.53594: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3490> <<< 28173 1726882747.53653: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 28173 1726882747.53657: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3940> <<< 28173 1726882747.53672: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3670> <<< 28173 1726882747.53700: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 28173 1726882747.53739: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 28173 1726882747.53770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 28173 1726882747.53794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 28173 1726882747.53829: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a190> <<< 28173 
1726882747.53832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 28173 1726882747.53850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 28173 1726882747.53914: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a220> <<< 28173 1726882747.53973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 28173 1726882747.53994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 28173 1726882747.53997: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b49d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a940> <<< 28173 1726882747.54024: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4db880> <<< 28173 1726882747.54037: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b473d90> <<< 28173 1726882747.54104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b49dd90> <<< 28173 1726882747.54148: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3970> <<< 28173 1726882747.54182: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28173 1726882747.54519: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 28173 1726882747.54555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 28173 1726882747.54578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 28173 1726882747.54603: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 28173 1726882747.54626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 28173 1726882747.54633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d3eb0> <<< 28173 1726882747.54674: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d5f40> <<< 28173 1726882747.54710: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 28173 1726882747.54722: stdout chunk (state=3): >>>import '_sre' # <<< 28173 1726882747.54756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 28173 1726882747.54780: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28173 1726882747.54821: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d2640> <<< 28173 1726882747.54828: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d3370> <<< 28173 1726882747.54844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 28173 1726882747.54914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 28173 1726882747.54925: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 28173 1726882747.54972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.54998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 28173 1726882747.55024: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b08fdc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08f8b0> import 'itertools' # <<< 28173 1726882747.55047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 28173 1726882747.55086: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08feb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 28173 1726882747.55109: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08ff70> <<< 28173 1726882747.55150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08fe80> <<< 28173 1726882747.55161: stdout chunk (state=3): >>>import '_collections' # <<< 28173 1726882747.55208: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1aed30> import '_functools' # <<< 28173 1726882747.55240: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1a7610> <<< 28173 1726882747.55311: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1dae20> <<< 28173 1726882747.55345: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 28173 1726882747.55371: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b0a1c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1ae250> <<< 28173 1726882747.55417: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.55423: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b1bb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1e09d0> <<< 28173 1726882747.55457: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.55497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 28173 1726882747.55521: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1d90> <<< 28173 1726882747.55565: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1d00> <<< 28173 1726882747.55598: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 28173 1726882747.55601: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 28173 1726882747.55612: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 28173 1726882747.55662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28173 1726882747.55709: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b074370> <<< 28173 1726882747.55712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 28173 1726882747.55724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28173 1726882747.55748: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b074460> <<< 28173 1726882747.55876: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a9fa0> <<< 28173 1726882747.55922: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3a30> <<< 28173 1726882747.55925: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3490> <<< 28173 1726882747.55951: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 28173 1726882747.55957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28173 1726882747.55985: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28173 1726882747.56013: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 28173 1726882747.56021: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afc21c0> <<< 28173 1726882747.56049: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b05fc70> <<< 28173 1726882747.56115: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3eb0> <<< 28173 1726882747.56119: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1e0040> <<< 28173 1726882747.56130: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 28173 1726882747.56158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 28173 1726882747.56758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afd4af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afd4e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe6730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe6c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af7e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afd4f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af8f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe65b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed 
from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af8f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a19d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaa760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 28173 1726882747.56999: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaaca0> <<< 28173 1726882747.57039: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afb71f0> <<< 28173 1726882747.57049: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaa8e0> <<< 28173 1726882747.57059: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191af9ea30> <<< 28173 1726882747.57090: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a15b0> <<< 28173 1726882747.57113: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 28173 1726882747.57192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 28173 1726882747.57232: stdout chunk (state=3): >>>import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaaa90> <<< 28173 1726882747.57634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f191a9e7670> <<< 28173 1726882747.57985: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 28173 1726882747.58013: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.58054: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 28173 1726882747.58085: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882747.58088: stdout chunk (state=3): >>> <<< 28173 1726882747.58112: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.58144: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 28173 1726882747.58183: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882747.58187: stdout chunk (state=3): >>> <<< 28173 1726882747.60107: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.61326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 28173 1726882747.61332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925790> <<< 28173 1726882747.61357: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.61383: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 28173 1726882747.61389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 28173 1726882747.61413: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 28173 1726882747.61444: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.61448: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a925160> <<< 28173 1726882747.61506: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925280> <<< 28173 1726882747.61542: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925ee0> <<< 28173 1726882747.61567: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 28173 
1726882747.61571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 28173 1726882747.61627: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925d00> <<< 28173 1726882747.61635: stdout chunk (state=3): >>>import 'atexit' # <<< 28173 1726882747.61666: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.61672: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a925f40> <<< 28173 1726882747.61684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 28173 1726882747.61728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 28173 1726882747.62385: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8fb160> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a7fe0a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a7fe280> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a7fec10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90cdc0> <<< 28173 1726882747.62453: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90c3d0> <<< 28173 1726882747.62491: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 28173 1726882747.62512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28173 1726882747.62551: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90cf40> <<< 28173 
1726882747.62590: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py<<< 28173 1726882747.62593: stdout chunk (state=3): >>> <<< 28173 1726882747.62620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc'<<< 28173 1726882747.62624: stdout chunk (state=3): >>> <<< 28173 1726882747.62666: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py<<< 28173 1726882747.62678: stdout chunk (state=3): >>> <<< 28173 1726882747.62687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 28173 1726882747.62722: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 28173 1726882747.62751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 28173 1726882747.62794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py<<< 28173 1726882747.62799: stdout chunk (state=3): >>> <<< 28173 1726882747.62817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 28173 1726882747.62839: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a95ac10> <<< 28173 1726882747.62955: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a928ca0> <<< 28173 1726882747.62977: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a928370> <<< 28173 1726882747.62999: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8d9bb0> <<< 28173 1726882747.63041: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.63060: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.63086: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a928490> <<< 28173 1726882747.63123: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py <<< 28173 1726882747.63151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc'<<< 28173 1726882747.63154: stdout chunk (state=3): >>> <<< 28173 1726882747.63160: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a9284c0> <<< 28173 1726882747.63200: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 28173 1726882747.63231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 28173 1726882747.63235: stdout chunk (state=3): >>> <<< 28173 1726882747.63269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28173 
1726882747.63334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc'<<< 28173 1726882747.63339: stdout chunk (state=3): >>> <<< 28173 1726882747.63431: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.63457: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882747.63461: stdout chunk (state=3): >>> <<< 28173 1726882747.63473: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a86c040> <<< 28173 1726882747.63489: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a9058e0> <<< 28173 1726882747.63523: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 28173 1726882747.63555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 28173 1726882747.63635: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.63655: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.63668: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8688e0> <<< 28173 1726882747.63688: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a905160> <<< 28173 1726882747.63724: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py<<< 28173 1726882747.63730: stdout chunk (state=3): >>> <<< 28173 1726882747.63796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc'<<< 28173 1726882747.63802: stdout chunk (state=3): >>> <<< 28173 1726882747.63832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py<<< 28173 1726882747.63839: stdout chunk (state=3): >>> <<< 28173 1726882747.63854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 28173 1726882747.63881: stdout chunk (state=3): >>>import '_string' # <<< 28173 1726882747.63982: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a905ca0> <<< 28173 1726882747.64212: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a868880> <<< 28173 1726882747.64350: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64370: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64383: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a930460> <<< 28173 1726882747.64426: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64454: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64466: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a905a30> <<< 28173 1726882747.64527: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64554: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64568: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a9053a0> <<< 28173 1726882747.64588: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a96c6d0> <<< 28173 1726882747.64623: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py<<< 28173 1726882747.64630: stdout chunk (state=3): >>> <<< 28173 1726882747.64646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 28173 1726882747.64683: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 28173 1726882747.64720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc'<<< 28173 1726882747.64725: stdout chunk (state=3): >>> <<< 28173 1726882747.64786: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64807: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.64823: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a87bd90> <<< 28173 1726882747.65124: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.65157: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.65160: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8ed9d0> <<< 28173 1726882747.65186: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a85d610> <<< 28173 1726882747.65228: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.65250: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.65266: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a87b100> <<< 28173 1726882747.65283: stdout chunk (state=3): 
>>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a85da00> <<< 28173 1726882747.65309: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882747.65315: stdout chunk (state=3): >>> <<< 28173 1726882747.65337: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65361: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 28173 1726882747.65398: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65522: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65645: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65667: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882747.65681: stdout chunk (state=3): >>> <<< 28173 1726882747.65693: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 28173 1726882747.65716: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65740: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65762: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py<<< 28173 1726882747.65770: stdout chunk (state=3): >>> <<< 28173 1726882747.65796: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.65967: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.66121: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.66730: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.67202: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 28173 1726882747.67206: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 28173 1726882747.67227: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 28173 1726882747.67242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.67292: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8798e0> <<< 28173 1726882747.67374: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 28173 1726882747.67377: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8a6940> <<< 28173 1726882747.67388: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3ed970> <<< 28173 1726882747.67422: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 28173 1726882747.67453: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.67458: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.67477: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 28173 1726882747.67608: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.67723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 28173 1726882747.67728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 28173 1726882747.67741: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8e3790> <<< 28173 1726882747.67764: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.68341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69015: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69050: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69143: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 28173 1726882747.69148: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69193: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69235: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 28173 1726882747.69241: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69324: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69432: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 28173 1726882747.69444: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69452: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 28173 1726882747.69484: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69521: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69562: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 28173 1726882747.69581: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.69872: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28173 1726882747.70207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 28173 1726882747.70221: stdout chunk (state=3): >>>import '_ast' # <<< 28173 1726882747.70328: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3f1d90> <<< 28173 1726882747.70334: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70417: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70502: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 28173 1726882747.70508: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 28173 1726882747.70533: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28173 1726882747.70542: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70587: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70639: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 28173 1726882747.70644: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70695: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70737: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70840: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.70937: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.71009: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8970a0> <<< 28173 1726882747.71128: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3d3430> <<< 28173 
1726882747.71457: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 28173 1726882747.71777: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8a0160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a89ccd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3f1bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 28173 1726882747.72052: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 28173 1726882747.72076: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882747.72154: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882747.72165: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 28173 1726882747.72224: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72305: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72329: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72356: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 28173 1726882747.72366: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72535: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72690: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72715: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.72789: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.72823: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 28173 1726882747.72830: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 28173 1726882747.72858: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a16ca60> <<< 28173 1726882747.72899: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 28173 1726882747.72934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 28173 1726882747.72938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 28173 1726882747.72968: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3cb6d0> <<< 28173 1726882747.73023: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a3cbaf0> <<< 28173 1726882747.73089: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3b2250> <<< 28173 1726882747.73106: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3b2a30> <<< 28173 1726882747.73146: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401910> <<< 28173 1726882747.73172: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 28173 1726882747.73175: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 28173 1726882747.73188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 28173 1726882747.73248: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a3fdd00> <<< 28173 1726882747.73261: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3fdd60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 28173 1726882747.73299: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3fd250> <<< 28173 1726882747.73319: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 28173 1726882747.73378: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a1d4f70> <<< 28173 1726882747.73437: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a4164c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401310> <<< 28173 1726882747.73442: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 28173 1726882747.73478: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 28173 1726882747.73512: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.73584: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 28173 1726882747.73615: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.74073: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 28173 1726882747.74236: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 28173 1726882747.74241: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 28173 1726882747.75095: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.75577: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 28173 1726882747.75593: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.75685: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.75756: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.75811: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.75859: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 28173 1726882747.75878: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py<<< 28173 1726882747.75908: stdout chunk (state=3): >>> # zipimport: zlib available <<< 28173 1726882747.75952: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76015: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py<<< 28173 1726882747.76032: stdout chunk (state=3): >>> # zipimport: zlib available <<< 28173 1726882747.76124: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 28173 1726882747.76135: stdout chunk (state=3): >>> <<< 28173 1726882747.76193: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py<<< 28173 1726882747.76218: stdout chunk (state=3): >>> <<< 28173 1726882747.76254: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76284: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76386: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 28173 1726882747.76460: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0e3ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 28173 1726882747.76755: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0e3fd0> <<< 28173 1726882747.76771: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 28173 1726882747.76828: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76883: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 28173 1726882747.76895: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.76963: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.77055: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 28173 1726882747.77105: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.77188: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 28173 1726882747.77202: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.77213: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.77283: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 
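The "# code object from '…'", "import 'x' # <…Loader…>" and "# zipimport: zlib available" lines in these chunks (and the "# cleanup[2] removing …" lines near the end of the run) are CPython's own verbose-import trace rather than Ansible output: the managed node runs the module with PYTHONVERBOSE=1 set, as the ansible_env facts reported further below confirm. As a point of reference only, a minimal local sketch that produces the same kind of trace with the equivalent -v switch (the module imported here, ssl, is an arbitrary example):

import subprocess
import sys

# "-v" (equivalent to PYTHONVERBOSE=1) makes the interpreter write its import trace
# and the shutdown-time "# cleanup[...]" messages to stderr.
result = subprocess.run(
    [sys.executable, "-v", "-c", "import ssl"],
    capture_output=True,
    text=True,
)

# Keep only the lines that mirror the chunks captured above.
for line in result.stderr.splitlines():
    if line.startswith("import ") or line.startswith("# code object"):
        print(line)
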
28173 1726882747.77289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 28173 1726882747.77433: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.77439: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a0e0b20> <<< 28173 1726882747.77885: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a12f4f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 28173 1726882747.77979: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 28173 1726882747.77985: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78086: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78196: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78345: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78556: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 28173 1726882747.78568: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78610: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78671: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 28173 1726882747.78675: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78721: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78785: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 28173 1726882747.78791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 28173 1726882747.78845: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a05e190> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a05e2e0> <<< 28173 1726882747.78852: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 28173 1726882747.78854: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78880: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 28173 1726882747.78900: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.78947: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79001: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 28173 1726882747.79006: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79215: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79679: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882747.79721: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79775: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 28173 1726882747.79788: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 28173 1726882747.79804: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79941: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.79972: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80173: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80357: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 28173 1726882747.80366: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 28173 1726882747.80380: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80547: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80709: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 28173 1726882747.80730: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80772: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.80818: stdout chunk (state=3): >>># zipimport: zlib available <<< 
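Every ansible.module_utils.* entry in this stream is imported "from Zip": the setup module arrives on the managed node as a single self-contained archive (the AnsiballZ payload) under /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/, and zipimport resolves the fact collectors out of it. Purely as an illustrative sketch — it assumes the remote temp files were kept for debugging (for example with ANSIBLE_KEEP_REMOTE_FILES=1), and the payload path, copied verbatim from this log, is unique to this run — the bundled fact-collector modules could be listed with the standard zipfile module:

import zipfile

# Path copied from the log above; real runs create a fresh random temp directory,
# and the archive is removed afterwards unless remote temp files are preserved.
PAYLOAD = (
    "/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/"
    "ansible_ansible.legacy.setup_payload.zip"
)

with zipfile.ZipFile(PAYLOAD) as zf:
    for name in sorted(zf.namelist()):
        if name.startswith("ansible/module_utils/facts/"):
            print(name)
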
28173 1726882747.81546: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.82226: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 28173 1726882747.82252: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 28173 1726882747.82268: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.82397: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.82532: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 28173 1726882747.82539: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.82673: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.82795: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 28173 1726882747.82817: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83015: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83214: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 28173 1726882747.83232: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83257: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83263: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 28173 1726882747.83291: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83344: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83411: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 28173 1726882747.83416: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83548: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83676: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.83952: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84220: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 28173 1726882747.84245: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 28173 1726882747.84248: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84295: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84353: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 28173 1726882747.84367: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84397: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84424: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 28173 1726882747.84440: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84530: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84618: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 28173 1726882747.84640: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84666: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84708: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 28173 1726882747.84719: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84794: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84865: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 28173 1726882747.84894: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.84945: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.85009: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 28173 1726882747.85359: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.85718: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 28173 1726882747.85721: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.85791: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.85852: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 28173 1726882747.85898: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 28173 1726882747.85946: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 28173 1726882747.85950: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.85988: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86023: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 28173 1726882747.86027: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86070: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86105: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 28173 1726882747.86108: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86205: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86306: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 28173 1726882747.86326: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 28173 1726882747.86344: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86394: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86451: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 28173 1726882747.86482: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86495: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86558: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86611: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86702: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86786: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 28173 1726882747.86792: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 28173 1726882747.86809: stdout chunk (state=3): >>># zipimport: zlib available <<< 
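The collector imports continue for a few more chunks (the remaining virtual.* modules, default_collectors and ansible_collector); once they finish, the module writes its whole result to stdout as a single JSON document ({"ansible_facts": {…}, "invocation": {…}}), which appears below. As a hedged sketch of picking such a captured result apart offline, the snippet embeds a heavily trimmed stand-in with the same shape as that output (values copied from the facts reported for managed_node1 below), not the full fact set:

import json

# Trimmed stand-in for the real setup-module result printed later in this log.
raw = """
{"ansible_facts": {"ansible_distribution": "CentOS",
                   "ansible_distribution_major_version": "9",
                   "ansible_default_ipv4": {"interface": "eth0",
                                            "address": "10.31.11.158"}},
 "invocation": {"module_args": {"gather_subset": ["all"]}}}
"""

result = json.loads(raw)
facts = result["ansible_facts"]
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
print("default ipv4:", facts["ansible_default_ipv4"]["address"],
      "on", facts["ansible_default_ipv4"]["interface"])
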
28173 1726882747.86861: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.86925: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 28173 1726882747.86930: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87184: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87434: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 28173 1726882747.87440: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87491: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87549: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 28173 1726882747.87554: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87610: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87666: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 28173 1726882747.87676: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87773: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87875: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 28173 1726882747.87890: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 28173 1726882747.87892: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.87993: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.88104: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py <<< 28173 1726882747.88109: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 28173 1726882747.88214: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882747.89252: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 28173 1726882747.89257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 28173 
1726882747.89289: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 28173 1726882747.89299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 28173 1726882747.89340: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.89362: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882747.89371: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1919eded60> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919edefa0> <<< 28173 1726882747.89440: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919ea4cd0> <<< 28173 1726882747.91091: stdout chunk (state=3): >>>import 'gc' # <<< 28173 1726882747.94101: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 28173 1726882747.94124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 28173 1726882747.94139: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919ede910> <<< 28173 1726882747.94176: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 28173 1726882747.94205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 28173 1726882747.94230: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a05b8b0> <<< 28173 1726882747.94320: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 28173 1726882747.94327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882747.94364: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 28173 1726882747.94368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 28173 1726882747.94381: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0b7760> <<< 28173 1726882747.94388: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a05ba90> <<< 28173 1726882747.94812: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28173 1726882747.94844: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 28173 1726882747.94847: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28173 
1726882747.94849: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28173 1726882748.19919: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZD<<< 28173 1726882748.19976: stdout chunk (state=3): >>>RoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []<<< 28173 1726882748.19996: stdout chunk (state=3): >>>}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 686, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 
0, "passno": 0, "size_total": 268367278080, "size_available": 264238256128, "block_size": 4096, "block_total": 65519355, "block_available": 64511293, "block_used": 1008062, "inode_total": 131071472, "inode_available": 130998689, "inode_used": 72783, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [<<< 28173 1726882748.20012: stdout chunk (state=3): >>>fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": 
"off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "08", "epoch": "1726882748", "epoch_int": "1726882748", "date": "2024-09-20", "time": "21:39:08", "iso8601_micro": "2024-09-21T01:39:08.194259Z", "iso8601": "2024-09-21T01:39:08Z", "iso8601_basic": "20240920T213908194259", "iso8601_basic_short": "20240920T213908", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.43, "5m": 0.41, "15m": 0.24}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882748.20548: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 28173 1726882748.20645: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp <<< 28173 1726882748.20730: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] 
removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six <<< 28173 1726882748.20847: stdout chunk (state=3): >>># destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing 
swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro<<< 28173 1726882748.20924: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version<<< 28173 1726882748.21009: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] 
removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata<<< 28173 1726882748.21013: stdout chunk (state=3): >>> # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 28173 1726882748.21304: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28173 1726882748.21323: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 28173 1726882748.21358: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 28173 1726882748.21383: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 28173 1726882748.21404: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # 
destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 28173 1726882748.21427: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28173 1726882748.21471: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 28173 1726882748.21540: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues <<< 28173 1726882748.21565: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 28173 1726882748.21601: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 28173 1726882748.21618: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 28173 1726882748.21674: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 28173 1726882748.21677: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 28173 1726882748.21722: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 28173 1726882748.21779: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 28173 1726882748.21833: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 28173 1726882748.21916: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # 
destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 28173 1726882748.21941: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 28173 1726882748.21956: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 28173 1726882748.21979: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 28173 1726882748.22161: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 28173 1726882748.22217: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 28173 1726882748.22230: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 28173 1726882748.22279: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 28173 1726882748.22755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882748.22758: stdout chunk (state=3): >>><<< 28173 1726882748.22761: stderr chunk (state=3): >>><<< 28173 1726882748.23211: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b51eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b49d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b47a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b473d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b49dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b4c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d5f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1d3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b08fdc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08f8b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08feb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08ff70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b08fe80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1aed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1a7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b0a1c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1ae250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191b1bb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1fa0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a1d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b074370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b074460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a9fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afc21c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b05fc70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a3eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b1e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afd4af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afd4e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe6730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe6c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af7e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afd4f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af8f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afe65b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191af8f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a19d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaa760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaa850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afaaca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191afb71f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaa8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191af9ea30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191b0a15b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191afaaa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f191a9e7670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925790> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a925160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f191a925ee0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925d00> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a925f40> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a925100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8fb160> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a7fe0a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a7fe280> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a7fec10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90cdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90c3d0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a90cf40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a95ac10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a928ca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a928370> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8d9bb0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a928490> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a9284c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a86c040> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a9058e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8688e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a905160> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a905ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a868880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a930460> # extension 
module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a905a30> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a9053a0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a96c6d0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a87bd90> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8ed9d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a85d610> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a87b100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a85da00> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8798e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8a6940> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3ed970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8e3790> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3f1d90> # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a8970a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3d3430> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a8a0160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a89ccd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3f1bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a16ca60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3cb6d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a3cbaf0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3b2250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3b2a30> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a3fdd00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3fdd60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a3fd250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a1d4f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a4164c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a401310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0e3ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0e3fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a0e0b20> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a12f4f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f191a05e190> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f191a05e2e0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_kdkvpxxt/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1919eded60> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919edefa0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919ea4cd0> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1919ede910> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a05b8b0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a0b7760> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f191a05ba90> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", 
"SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 686, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238256128, "block_size": 4096, "block_total": 65519355, "block_available": 64511293, "block_used": 1008062, "inode_total": 131071472, "inode_available": 130998689, "inode_used": 72783, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": 
"255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "08", "epoch": "1726882748", "epoch_int": "1726882748", "date": "2024-09-20", "time": "21:39:08", "iso8601_micro": "2024-09-21T01:39:08.194259Z", "iso8601": "2024-09-21T01:39:08Z", "iso8601_basic": "20240920T213908194259", "iso8601_basic_short": "20240920T213908", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.43, "5m": 0.41, "15m": 0.24}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs 
# cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog 
# cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping 
configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # 
destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] 
removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] 
removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # 
cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
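
Editor's note: the interpreter-discovery warning above is informational. Fact gathering found /usr/bin/python3.9 on managed_node2 and this run keeps using that path, but a later Python installation could change what the discovered path points at. One common way to make the choice explicit (and silence the warning on future runs) is to pin the interpreter as a host variable in the inventory. A minimal sketch, assuming a YAML inventory for this host; the layout below is hypothetical and only the host name and interpreter path come from the warning itself:

    # hypothetical inventory snippet, not part of the captured run
    all:
      hosts:
        managed_node2:
          ansible_python_interpreter: /usr/bin/python3.9

Pinning the variable per host (or per group, or globally via the interpreter_python setting in ansible.cfg) keeps the interpreter stable across runs, which is exactly the drift the warning is cautioning about.
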
28173 1726882748.24937: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882748.24940: _low_level_execute_command(): starting 28173 1726882748.24943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882746.9031072-28195-41565633484875/ > /dev/null 2>&1 && sleep 0' 28173 1726882748.26252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882748.26482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.26496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.26513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.26554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.26571: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882748.26585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.26603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882748.26616: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882748.26628: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882748.26641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.26659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.26683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.26695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.26706: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882748.26719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.26796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.26819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882748.26884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.27014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882748.28944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882748.28970: stdout chunk (state=3): >>><<< 28173 1726882748.28973: stderr chunk (state=3): >>><<< 28173 1726882748.29071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882748.29075: handler run complete 28173 1726882748.29173: variable 'ansible_facts' from source: unknown 28173 1726882748.29261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.29627: variable 'ansible_facts' from source: unknown 28173 1726882748.29724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.30087: attempt loop complete, returning result 28173 1726882748.30096: _execute() done 28173 1726882748.30102: dumping result to json 28173 1726882748.30140: done dumping result, returning 28173 1726882748.30155: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-0000000000f5] 28173 1726882748.30167: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f5 ok: [managed_node2] 28173 1726882748.31122: no more pending results, returning what we have 28173 1726882748.31125: results queue empty 28173 1726882748.31126: checking for any_errors_fatal 28173 1726882748.31127: done checking for any_errors_fatal 28173 1726882748.31128: checking for max_fail_percentage 28173 1726882748.31129: done checking for max_fail_percentage 28173 1726882748.31130: checking to see if all hosts have failed and the running result is not ok 28173 1726882748.31131: done checking to see if all hosts have failed 28173 1726882748.31132: getting the remaining hosts for this loop 28173 1726882748.31134: done getting the remaining hosts for this loop 28173 1726882748.31137: getting the next task for host managed_node2 28173 1726882748.31144: done getting next task for host managed_node2 28173 1726882748.31146: ^ task is: TASK: meta (flush_handlers) 28173 1726882748.31148: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882748.31160: getting variables 28173 1726882748.31163: in VariableManager get_vars() 28173 1726882748.31200: Calling all_inventory to load vars for managed_node2 28173 1726882748.31203: Calling groups_inventory to load vars for managed_node2 28173 1726882748.31207: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882748.31218: Calling all_plugins_play to load vars for managed_node2 28173 1726882748.31229: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882748.31233: Calling groups_plugins_play to load vars for managed_node2 28173 1726882748.31442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.31916: done with get_vars() 28173 1726882748.31927: done getting variables 28173 1726882748.32874: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f5 28173 1726882748.32878: WORKER PROCESS EXITING 28173 1726882748.32931: in VariableManager get_vars() 28173 1726882748.32940: Calling all_inventory to load vars for managed_node2 28173 1726882748.32942: Calling groups_inventory to load vars for managed_node2 28173 1726882748.32945: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882748.32949: Calling all_plugins_play to load vars for managed_node2 28173 1726882748.32952: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882748.32958: Calling groups_plugins_play to load vars for managed_node2 28173 1726882748.33258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.33522: done with get_vars() 28173 1726882748.33534: done queuing things up, now waiting for results queue to drain 28173 1726882748.33536: results queue empty 28173 1726882748.33537: checking for any_errors_fatal 28173 1726882748.33539: done checking for any_errors_fatal 28173 1726882748.33540: checking for max_fail_percentage 28173 1726882748.33541: done checking for max_fail_percentage 28173 1726882748.33542: checking to see if all hosts have failed and the running result is not ok 28173 1726882748.33542: done checking to see if all hosts have failed 28173 1726882748.33543: getting the remaining hosts for this loop 28173 1726882748.33544: done getting the remaining hosts for this loop 28173 1726882748.33546: getting the next task for host managed_node2 28173 1726882748.33550: done getting next task for host managed_node2 28173 1726882748.33552: ^ task is: TASK: Include the task 'el_repo_setup.yml' 28173 1726882748.33553: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882748.33555: getting variables 28173 1726882748.33556: in VariableManager get_vars() 28173 1726882748.33562: Calling all_inventory to load vars for managed_node2 28173 1726882748.33567: Calling groups_inventory to load vars for managed_node2 28173 1726882748.33570: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882748.33574: Calling all_plugins_play to load vars for managed_node2 28173 1726882748.33576: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882748.33578: Calling groups_plugins_play to load vars for managed_node2 28173 1726882748.33716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.33929: done with get_vars() 28173 1726882748.33936: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:11 Friday 20 September 2024 21:39:08 -0400 (0:00:01.494) 0:00:01.505 ****** 28173 1726882748.34108: entering _queue_task() for managed_node2/include_tasks 28173 1726882748.34110: Creating lock for include_tasks 28173 1726882748.34575: worker is 1 (out of 1 available) 28173 1726882748.34588: exiting _queue_task() for managed_node2/include_tasks 28173 1726882748.34598: done queuing things up, now waiting for results queue to drain 28173 1726882748.34600: waiting for pending results... 28173 1726882748.35231: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 28173 1726882748.35352: in run() - task 0e448fcc-3ce9-926c-8928-000000000006 28173 1726882748.35380: variable 'ansible_search_path' from source: unknown 28173 1726882748.35431: calling self._execute() 28173 1726882748.35510: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882748.35525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882748.35545: variable 'omit' from source: magic vars 28173 1726882748.35682: _execute() done 28173 1726882748.35691: dumping result to json 28173 1726882748.35699: done dumping result, returning 28173 1726882748.35710: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-926c-8928-000000000006] 28173 1726882748.35723: sending task result for task 0e448fcc-3ce9-926c-8928-000000000006 28173 1726882748.35841: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000006 28173 1726882748.35885: no more pending results, returning what we have 28173 1726882748.35891: in VariableManager get_vars() 28173 1726882748.35922: Calling all_inventory to load vars for managed_node2 28173 1726882748.35925: Calling groups_inventory to load vars for managed_node2 28173 1726882748.35929: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882748.35940: Calling all_plugins_play to load vars for managed_node2 28173 1726882748.35944: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882748.35947: Calling groups_plugins_play to load vars for managed_node2 28173 1726882748.36139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.36379: done with get_vars() 28173 1726882748.36386: variable 'ansible_search_path' from source: unknown 28173 1726882748.36402: we have included files to process 28173 1726882748.36403: generating all_blocks data 28173 1726882748.36404: 
done generating all_blocks data 28173 1726882748.36405: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28173 1726882748.36407: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28173 1726882748.36409: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28173 1726882748.37113: WORKER PROCESS EXITING 28173 1726882748.37660: in VariableManager get_vars() 28173 1726882748.37677: done with get_vars() 28173 1726882748.38029: done processing included file 28173 1726882748.38032: iterating over new_blocks loaded from include file 28173 1726882748.38033: in VariableManager get_vars() 28173 1726882748.38050: done with get_vars() 28173 1726882748.38052: filtering new block on tags 28173 1726882748.38071: done filtering new block on tags 28173 1726882748.38074: in VariableManager get_vars() 28173 1726882748.38085: done with get_vars() 28173 1726882748.38087: filtering new block on tags 28173 1726882748.38101: done filtering new block on tags 28173 1726882748.38104: in VariableManager get_vars() 28173 1726882748.38113: done with get_vars() 28173 1726882748.38115: filtering new block on tags 28173 1726882748.38128: done filtering new block on tags 28173 1726882748.38217: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 28173 1726882748.38223: extending task lists for all hosts with included blocks 28173 1726882748.38275: done extending task lists 28173 1726882748.38277: done processing included files 28173 1726882748.38278: results queue empty 28173 1726882748.38278: checking for any_errors_fatal 28173 1726882748.38280: done checking for any_errors_fatal 28173 1726882748.38281: checking for max_fail_percentage 28173 1726882748.38282: done checking for max_fail_percentage 28173 1726882748.38282: checking to see if all hosts have failed and the running result is not ok 28173 1726882748.38283: done checking to see if all hosts have failed 28173 1726882748.38284: getting the remaining hosts for this loop 28173 1726882748.38285: done getting the remaining hosts for this loop 28173 1726882748.38288: getting the next task for host managed_node2 28173 1726882748.38292: done getting next task for host managed_node2 28173 1726882748.38294: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 28173 1726882748.38296: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882748.38298: getting variables 28173 1726882748.38300: in VariableManager get_vars() 28173 1726882748.38308: Calling all_inventory to load vars for managed_node2 28173 1726882748.38310: Calling groups_inventory to load vars for managed_node2 28173 1726882748.38312: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882748.38317: Calling all_plugins_play to load vars for managed_node2 28173 1726882748.38320: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882748.38322: Calling groups_plugins_play to load vars for managed_node2 28173 1726882748.38727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882748.39154: done with get_vars() 28173 1726882748.39171: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:39:08 -0400 (0:00:00.051) 0:00:01.556 ****** 28173 1726882748.39243: entering _queue_task() for managed_node2/setup 28173 1726882748.39507: worker is 1 (out of 1 available) 28173 1726882748.39519: exiting _queue_task() for managed_node2/setup 28173 1726882748.39528: done queuing things up, now waiting for results queue to drain 28173 1726882748.39530: waiting for pending results... 28173 1726882748.39781: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 28173 1726882748.39883: in run() - task 0e448fcc-3ce9-926c-8928-000000000106 28173 1726882748.39903: variable 'ansible_search_path' from source: unknown 28173 1726882748.39911: variable 'ansible_search_path' from source: unknown 28173 1726882748.39951: calling self._execute() 28173 1726882748.40026: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882748.40037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882748.40057: variable 'omit' from source: magic vars 28173 1726882748.40604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882748.43023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882748.43091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882748.43134: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882748.43174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882748.43204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882748.43286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882748.43335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882748.43388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28173 1726882748.43435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882748.43457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882748.43629: variable 'ansible_facts' from source: unknown 28173 1726882748.43714: variable 'network_test_required_facts' from source: task vars 28173 1726882748.43755: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 28173 1726882748.43773: variable 'omit' from source: magic vars 28173 1726882748.43813: variable 'omit' from source: magic vars 28173 1726882748.43849: variable 'omit' from source: magic vars 28173 1726882748.43882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882748.43911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882748.43932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882748.43952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882748.43968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882748.44002: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882748.44011: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882748.44018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882748.44116: Set connection var ansible_pipelining to False 28173 1726882748.44125: Set connection var ansible_shell_type to sh 28173 1726882748.44138: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882748.44165: Set connection var ansible_timeout to 10 28173 1726882748.44184: Set connection var ansible_shell_executable to /bin/sh 28173 1726882748.44205: Set connection var ansible_connection to ssh 28173 1726882748.44237: variable 'ansible_shell_executable' from source: unknown 28173 1726882748.44257: variable 'ansible_connection' from source: unknown 28173 1726882748.44273: variable 'ansible_module_compression' from source: unknown 28173 1726882748.44292: variable 'ansible_shell_type' from source: unknown 28173 1726882748.44318: variable 'ansible_shell_executable' from source: unknown 28173 1726882748.44335: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882748.44350: variable 'ansible_pipelining' from source: unknown 28173 1726882748.44356: variable 'ansible_timeout' from source: unknown 28173 1726882748.44364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882748.44548: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882748.44560: variable 'omit' from source: magic vars 28173 1726882748.44566: starting attempt loop 28173 
1726882748.44582: running the handler 28173 1726882748.44601: _low_level_execute_command(): starting 28173 1726882748.44614: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882748.45113: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.45135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882748.45149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.45199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.45224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.45331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28173 1726882748.47640: stdout chunk (state=3): >>>/root <<< 28173 1726882748.47801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882748.47845: stderr chunk (state=3): >>><<< 28173 1726882748.47848: stdout chunk (state=3): >>><<< 28173 1726882748.47868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28173 1726882748.47882: _low_level_execute_command(): starting 28173 1726882748.47887: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334 `" && echo ansible-tmp-1726882748.4787-28260-48032118752334="` echo /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334 `" ) && sleep 0' 28173 
1726882748.48371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882748.48375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.48377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.48396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.48427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.48438: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882748.48452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.48474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882748.48486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882748.48497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882748.48508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.48520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.48536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.48547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.48558: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882748.48576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.48646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.48674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882748.48693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.49014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28173 1726882748.51455: stdout chunk (state=3): >>>ansible-tmp-1726882748.4787-28260-48032118752334=/root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334 <<< 28173 1726882748.51714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882748.51717: stdout chunk (state=3): >>><<< 28173 1726882748.51720: stderr chunk (state=3): >>><<< 28173 1726882748.51771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882748.4787-28260-48032118752334=/root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28173 1726882748.51812: variable 'ansible_module_compression' from source: unknown 28173 1726882748.51977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882748.51981: variable 'ansible_facts' from source: unknown 28173 1726882748.52098: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/AnsiballZ_setup.py 28173 1726882748.52258: Sending initial data 28173 1726882748.52261: Sent initial data (150 bytes) 28173 1726882748.53188: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882748.53204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.53225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.53241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.53275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.53288: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882748.53301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.53323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882748.53335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882748.53347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882748.53359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.53375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.53389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.53403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.53416: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882748.53428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.53515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.53539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882748.53558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.53701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28173 1726882748.56212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882748.56308: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882748.56406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpusofyj6z /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/AnsiballZ_setup.py <<< 28173 1726882748.56506: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882748.59231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882748.59473: stderr chunk (state=3): >>><<< 28173 1726882748.59479: stdout chunk (state=3): >>><<< 28173 1726882748.59481: done transferring module to remote 28173 1726882748.59483: _low_level_execute_command(): starting 28173 1726882748.59486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/ /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/AnsiballZ_setup.py && sleep 0' 28173 1726882748.60395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882748.60414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.60440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.60457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.60496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.60499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.60501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.60556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.60571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.60671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28173 1726882748.62597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882748.62678: stderr chunk (state=3): >>><<< 28173 1726882748.62688: stdout chunk (state=3): >>><<< 28173 1726882748.62785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28173 1726882748.62788: _low_level_execute_command(): starting 28173 1726882748.62791: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/AnsiballZ_setup.py && sleep 0' 28173 1726882748.63728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882748.63741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.63759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.63784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.63825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.63837: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882748.63854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.63877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882748.63890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882748.63901: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882748.63911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882748.63923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882748.63937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882748.63949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882748.63966: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882748.63984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882748.64055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882748.64080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882748.64095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882748.64233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882748.66955: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 28173 1726882748.66978: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28173 1726882748.67037: stdout chunk (state=3): >>>import '_io' # import 
'marshal' # <<< 28173 1726882748.67076: stdout chunk (state=3): >>>import 'posix' # <<< 28173 1726882748.67104: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28173 1726882748.67192: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 28173 1726882748.67225: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 28173 1726882748.67269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73dc0> <<< 28173 1726882748.67317: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 28173 1726882748.67381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73b20> <<< 28173 1726882748.67384: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73ac0> <<< 28173 1726882748.67420: stdout chunk (state=3): >>>import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 28173 1726882748.67492: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 28173 1726882748.67510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # <<< 28173 1726882748.67522: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18670> <<< 28173 1726882748.67563: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 28173 1726882748.67577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 28173 1726882748.67608: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 28173 1726882748.67611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 28173 1726882748.67635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 28173 
1726882748.67650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 28173 1726882748.67662: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf190> <<< 28173 1726882748.67690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 28173 1726882748.67702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 28173 1726882748.67795: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf220> <<< 28173 1726882748.67829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 28173 1726882748.67842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf940> <<< 28173 1726882748.67869: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad30880> <<< 28173 1726882748.67909: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 28173 1726882748.67913: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acc8d90> <<< 28173 1726882748.67969: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 28173 1726882748.67972: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acf2d90> <<< 28173 1726882748.68018: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18970> <<< 28173 1726882748.68048: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
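
By this point the generated AnsiballZ_setup.py wrapper has been uploaded over the multiplexed SFTP channel (the "sftp> put ..." line), the temp directory and script have been made owner-executable with chmod u+x, and the script has been launched with PYTHONVERBOSE=1 /usr/bin/python3.9; everything from the interpreter banner above onward is that remote process's verbose import trace. A rough, hedged equivalent of those three calls, continuing the sketch above (HOST, CONTROL_PATH, remote_tmp and run_ssh() as defined there; the local payload path is hypothetical):

import subprocess

LOCAL_PAYLOAD = "/tmp/AnsiballZ_setup.py"           # hypothetical local copy of the wrapper
REMOTE_SCRIPT = f"{remote_tmp}/AnsiballZ_setup.py"

# 1. Upload the wrapper over the same multiplexed connection.
subprocess.run(
    ["sftp", "-o", f"ControlPath={CONTROL_PATH}", "-b", "-", HOST],
    input=f"put {LOCAL_PAYLOAD} {REMOTE_SCRIPT}\n",
    text=True, check=True,
)

# 2. Make the temp directory and the script executable for the owner.
run_ssh(f"chmod u+x {remote_tmp} {REMOTE_SCRIPT} && sleep 0")

# 3. Execute it; PYTHONVERBOSE=1 is what produces the import trace that
#    fills the rest of this output.
result = run_ssh(f"PYTHONVERBOSE=1 /usr/bin/python3.9 {REMOTE_SCRIPT} && sleep 0")
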
<<< 28173 1726882748.68377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 28173 1726882748.68404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 28173 1726882748.68424: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 28173 1726882748.68477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 28173 1726882748.68510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 28173 1726882748.68514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac93eb0> <<< 28173 1726882748.68548: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac96f40> <<< 28173 1726882748.68585: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 28173 1726882748.68614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<< 28173 1726882748.68618: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 28173 1726882748.68637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 28173 1726882748.68644: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28173 1726882748.68661: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac8c610> <<< 28173 1726882748.68687: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac92640> <<< 28173 1726882748.68692: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac93370> <<< 28173 1726882748.68723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 28173 1726882748.68788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 28173 1726882748.69329: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f3e7a94ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94cfd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95f0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac6ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac67670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac7a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac9ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a95fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac6e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7ac7a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7aca09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 28173 1726882748.69334: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 28173 1726882748.69522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28173 1726882748.69547: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a9323d0> <<< 28173 1726882748.69572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 28173 1726882748.69585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28173 1726882748.69627: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a9324c0> <<< 28173 1726882748.69947: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a967f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28173 1726882748.69950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 28173 1726882748.69970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28173 1726882748.69998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 28173 1726882748.70008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 28173 1726882748.70018: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a866220> <<< 28173 1726882748.70052: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a91d520> <<< 28173 1726882748.70129: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961f10> <<< 28173 1726882748.70132: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7aca0040> <<< 28173 1726882748.70158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 28173 1726882748.70191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 28173 1726882748.70220: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 28173 1726882748.70235: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a878b50> <<< 28173 1726882748.70241: stdout chunk (state=3): >>>import 'errno' # <<< 28173 1726882748.70330: stdout chunk (state=3): >>> <<< 28173 1726882748.70368: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.70375: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a878e80> <<< 28173 1726882748.70386: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 28173 1726882748.70423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 28173 1726882748.70430: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 28173 1726882748.70436: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889790> <<< 28173 1726882748.70462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 28173 1726882748.70502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 28173 1726882748.70540: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889cd0> <<< 28173 1726882748.70579: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a822400> <<< 28173 1726882748.70592: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a878f70> <<< 28173 1726882748.70615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 28173 1726882748.70624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 28173 1726882748.70677: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.70692: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a8332e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889610> <<< 28173 1726882748.70699: stdout chunk (state=3): >>>import 'pwd' # <<< 28173 1726882748.70727: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.70734: stdout chunk (state=3): >>># 
extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a8333a0> <<< 28173 1726882748.70785: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fa30> <<< 28173 1726882748.70800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 28173 1726882748.70826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 28173 1726882748.70844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 28173 1726882748.70868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 28173 1726882748.70905: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.70917: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e700> <<< 28173 1726882748.70929: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 28173 1726882748.70958: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.70965: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84e7c0> <<< 28173 1726882748.70993: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.71002: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e8b0> <<< 28173 1726882748.71031: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 28173 1726882748.71039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 28173 1726882748.71683: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84ed00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a859250> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84e940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a842a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84eaf0> <<< 28173 1726882748.71726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 28173 1726882748.71761: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3e7a7726d0> <<< 28173 1726882748.72142: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip' # zipimport: zlib available <<< 28173 1726882748.72249: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.72293: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 28173 1726882748.72326: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 28173 1726882748.72338: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.73540: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.74470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ba820> <<< 28173 1726882748.74522: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 28173 1726882748.74551: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 28173 1726882748.74572: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a149730> <<< 28173 1726882748.74601: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149610> <<< 28173 1726882748.74651: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149340> <<< 28173 1726882748.74670: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from 
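
The "zipimport: found 103 names" and "loaded from Zip" lines mark the AnsiballZ mechanism at work: the wrapper has unpacked its embedded payload to /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip, and the ansible.module_utils packages are imported straight out of that archive. A small, self-contained demonstration of the underlying Python feature, with made-up names:

# Write a tiny package into a zip, put the zip on sys.path, and let the
# import system load it from the archive -- the same mechanism the
# "loaded from Zip" lines above rely on.
import importlib
import sys
import tempfile
import zipfile

tmpdir = tempfile.mkdtemp(prefix="payload_demo_")
zip_path = f"{tmpdir}/demo_payload.zip"

with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("demo_pkg/__init__.py", "")
    zf.writestr("demo_pkg/mod.py", "def answer():\n    return 42\n")

sys.path.insert(0, zip_path)
mod = importlib.import_module("demo_pkg.mod")
print(mod.answer())      # -> 42
print(mod.__file__)      # .../demo_payload.zip/demo_pkg/mod.py
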
'/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 28173 1726882748.74703: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149160> <<< 28173 1726882748.74751: stdout chunk (state=3): >>>import 'atexit' # <<< 28173 1726882748.74772: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1493a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 28173 1726882748.74788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 28173 1726882748.74831: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149790> <<< 28173 1726882748.74870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 28173 1726882748.74910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 28173 1726882748.74927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 28173 1726882748.75013: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a139820> <<< 28173 1726882748.75053: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a139490> <<< 28173 1726882748.75100: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a139640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 28173 1726882748.75117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 28173 1726882748.75138: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a03f520> <<< 28173 1726882748.75160: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a144d60> <<< 28173 1726882748.75319: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1494f0> <<< 28173 1726882748.75355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches 
/usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28173 1726882748.75389: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1441c0> <<< 28173 1726882748.75421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 28173 1726882748.75453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 28173 1726882748.75483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 28173 1726882748.75497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a148b20> <<< 28173 1726882748.75571: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a118160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a118760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a045d30> <<< 28173 1726882748.75602: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a118670> <<< 28173 1726882748.75629: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a19ad00> <<< 28173 1726882748.75663: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 28173 1726882748.75685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28173 1726882748.75723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 28173 1726882748.75794: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a09ca00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1a4e80> <<< 28173 1726882748.75824: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches 
/usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 28173 1726882748.75895: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1a4eb0> <<< 28173 1726882748.75909: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 28173 1726882748.75946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882748.75977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 28173 1726882748.75993: stdout chunk (state=3): >>>import '_string' # <<< 28173 1726882748.76040: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1ac250> <<< 28173 1726882748.76173: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0aa0d0> <<< 28173 1726882748.76263: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1aca60> <<< 28173 1726882748.76296: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a16eb80> <<< 28173 1726882748.76341: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1a4cd0> <<< 28173 1726882748.76379: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a19aee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 28173 1726882748.76404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 28173 1726882748.76415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 28173 1726882748.76454: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0a60d0> <<< 28173 1726882748.76639: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a09d310> <<< 28173 1726882748.76694: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0a6cd0> <<< 28173 1726882748.76719: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0a6670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0a7100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 28173 1726882748.76736: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.76807: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.76882: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.76927: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 28173 1726882748.76940: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.77033: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.77134: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.77587: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.78049: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 28173 1726882748.78080: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 28173 1726882748.78097: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882748.78147: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module 
'_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0e5910> <<< 28173 1726882748.78229: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 28173 1726882748.78240: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ea9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c58640> <<< 28173 1726882748.78295: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 28173 1726882748.78330: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.78342: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 28173 1726882748.78456: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.78584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 28173 1726882748.78617: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1297f0> # zipimport: zlib available <<< 28173 1726882748.79016: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79379: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79433: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79495: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 28173 1726882748.79539: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79565: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 28173 1726882748.79621: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79708: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 28173 1726882748.79730: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 28173 1726882748.79765: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79805: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 28173 1726882748.79817: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.79990: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80185: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28173 1726882748.80216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 28173 1726882748.80285: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a166460> # zipimport: zlib available <<< 28173 1726882748.80345: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80416: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28173 1726882748.80440: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80471: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80512: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 28173 1726882748.80524: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80551: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80587: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80683: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.80748: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 28173 1726882748.80771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882748.80835: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0d90d0> <<< 28173 1726882748.80929: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ea1f0> <<< 28173 1726882748.80969: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 28173 1726882748.81025: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81099: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 
1726882748.81117: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81142: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 28173 1726882748.81161: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 28173 1726882748.81216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 28173 1726882748.81237: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 28173 1726882748.81254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 28173 1726882748.81321: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ecbb0> <<< 28173 1726882748.81360: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1b5070> <<< 28173 1726882748.81419: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0dc2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 28173 1726882748.81447: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.81471: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 28173 1726882748.81575: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 28173 1726882748.81588: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 28173 1726882748.81631: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81695: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81717: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81735: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81799: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81823: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81860: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 28173 1726882748.81930: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.81997: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.82017: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 28173 1726882748.82048: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 28173 1726882748.82059: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.82206: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.82337: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.82371: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.82437: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882748.82469: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 28173 1726882748.82480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 28173 1726882748.82518: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c0b400> <<< 28173 1726882748.82551: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 28173 1726882748.82579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 28173 1726882748.82616: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 28173 1726882748.82652: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c6a9a0> <<< 28173 1726882748.82674: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79c6adf0> <<< 28173 1726882748.82722: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c68490> <<< 28173 1726882748.82764: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79ae3040> <<< 28173 1726882748.82789: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d33a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d35e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py 
<<< 28173 1726882748.82815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 28173 1726882748.82834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 28173 1726882748.82877: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0d86d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c57730> <<< 28173 1726882748.82905: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 28173 1726882748.82933: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0d85e0> <<< 28173 1726882748.82951: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 28173 1726882748.82986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 28173 1726882748.83011: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.83033: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79c1ac70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79a329a0> <<< 28173 1726882748.83069: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d34f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 28173 1726882748.83112: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 28173 1726882748.83123: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83160: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83222: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 28173 1726882748.83260: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83379: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # 
loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.83405: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 28173 1726882748.83419: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83449: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83492: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 28173 1726882748.83533: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83572: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 28173 1726882748.83583: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83625: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83681: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83722: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.83788: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 28173 1726882748.83800: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84182: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84541: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 28173 1726882748.84590: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84636: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84669: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84701: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 28173 1726882748.84737: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84783: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 28173 1726882748.84815: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28173 1726882748.84869: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 28173 1726882748.84915: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84929: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 28173 1726882748.84950: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.84996: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 28173 1726882748.84999: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85051: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 28173 1726882748.85153: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d39d0> <<< 28173 1726882748.85169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 28173 1726882748.85195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 28173 1726882748.85371: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79952f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 28173 1726882748.85374: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85422: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85479: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 28173 1726882748.85482: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85551: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85635: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 28173 1726882748.85641: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85689: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85767: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 28173 1726882748.85770: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85796: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.85851: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches 
/usr/lib64/python3.9/ssl.py <<< 28173 1726882748.85867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 28173 1726882748.86011: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7994a3a0> <<< 28173 1726882748.86256: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79998100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 28173 1726882748.86305: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.86353: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 28173 1726882748.86427: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.86493: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.86596: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.87172: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e798de670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798deac0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 28173 1726882748.87359: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.87574: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 28173 1726882748.87577: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.88282: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.88428: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 28173 1726882748.88432: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 28173 1726882748.88444: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.88620: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.88792: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 28173 1726882748.88817: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.88869: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.88924: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.89523: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.90719: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 28173 1726882748.90739: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 28173 1726882748.91001: stdout chunk (state=3): >>># zipimport: zlib available import 
ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.91144: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91332: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 28173 1726882748.91449: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.91478: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 28173 1726882748.91538: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91614: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 28173 1726882748.91617: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91658: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91661: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 28173 1726882748.91703: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91773: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 28173 1726882748.91777: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91806: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.91860: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 28173 1726882748.92080: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92303: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 28173 1726882748.92307: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92339: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92395: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 28173 
1726882748.92441: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92469: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 28173 1726882748.92491: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92529: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 28173 1726882748.92583: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92597: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 28173 1726882748.92660: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92739: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 28173 1726882748.92775: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 28173 1726882748.92778: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92800: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92859: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 28173 1726882748.92864: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92890: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28173 1726882748.92928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.92968: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93026: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93099: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 28173 1726882748.93117: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93144: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93194: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 28173 1726882748.93359: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93542: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 28173 1726882748.93555: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93574: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.93610: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 28173 1726882748.94163: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 28173 1726882748.94332: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882748.94682: stdout chunk (state=3): >>>import 'gc' # <<< 28173 1726882748.95211: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 28173 1726882748.95229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 28173 1726882748.95259: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 28173 1726882748.95295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 28173 1726882748.95395: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882748.95398: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79933c70> <<< 28173 1726882748.95412: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798c5190> <<< 28173 1726882748.95514: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798c5340> <<< 28173 1726882748.97307: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, 
"final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06<<< 28173 1726882748.97326: stdout chunk (state=3): >>>kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "08", "epoch": "1726882748", "epoch_int": "1726882748", "date": "2024-09-20", "time": "21:39:08", "iso8601_micro": "2024-09-21T01:39:08.969948Z", "iso8601": "2024-09-21T01:39:08Z", "iso8601_basic": "20240920T213908969948", "iso8601_basic_short": "20240920T213908", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882748.97971: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 28173 1726882748.98090: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # 
cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 28173 1726882748.98239: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] 
removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # 
cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass <<< 28173 1726882748.98308: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network 
# destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 28173 1726882748.99045: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 28173 1726882748.99049: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # 
cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 28173 1726882748.99077: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 28173 1726882748.99117: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 28173 1726882748.99150: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 28173 1726882748.99162: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 28173 1726882748.99241: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 28173 1726882748.99245: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 28173 1726882748.99263: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle <<< 28173 1726882748.99316: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 28173 1726882748.99530: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 28173 1726882748.99577: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 28173 1726882748.99602: stdout chunk (state=3): >>># destroy _heapq <<< 28173 1726882748.99686: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy 
errno<<< 28173 1726882748.99711: stdout chunk (state=3): >>> # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 28173 1726882748.99791: stdout chunk (state=3): >>># destroy select <<< 28173 1726882748.99843: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error <<< 28173 1726882748.99875: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 28173 1726882748.99943: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 28173 1726882749.00020: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 28173 1726882749.00566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882749.00570: stdout chunk (state=3): >>><<< 28173 1726882749.00572: stderr chunk (state=3): >>><<< 28173 1726882749.00810: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad73ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3e7ad18940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7accf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad30880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7acf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ad18970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
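The interpreter banner above, together with the "import <name> # <loader>", "# code object from ...", "# cleanup[n] removing/wiping" and "# destroy" messages that fill this stdout recap, is CPython's own import tracing: the remote Python that runs the setup payload appears to have verbose import reporting enabled (the equivalent of python -v), so it prints every module it loads at startup and every module it tears down at interpreter shutdown. A minimal sketch of reproducing that trace locally, assuming only the standard library (on a plain CPython invocation these messages go to stderr):

    import subprocess
    import sys

    # Start a throwaway interpreter with import tracing (-v) and capture its
    # output; CPython prints one "import ..." line per module it loads and the
    # "# cleanup"/"# destroy" lines as it finalizes.
    result = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )

    for line in result.stderr.splitlines():
        if line.startswith(("import ", "# cleanup", "# destroy")):
            print(line)

Applying the same filter to the chunked stdout in this log recovers the load/unload sequence of the fact-gathering payload shown above and continued below.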
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac93eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac96f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac8c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac92640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac93370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a94ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a94cfd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95f0d0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac6ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac67670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac7a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac9ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a95fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7ac6e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7ac7a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7aca09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a9323d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a9324c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a967f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a866220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a91d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a961f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7aca0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a878b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a878e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a822400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a878f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a8332e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a889610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a8333a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95fa30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84e7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84e8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a84ed00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a859250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84e940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a842a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a95f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a84eaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3e7a7726d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ba820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a149730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1493a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a149790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a139820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a139490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a139640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a03f520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a144d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1494f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1441c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a148b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a118160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a118760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a045d30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a118670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a19ad00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a09ca00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1a4e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1a4eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1ac250> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0aa0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1aca60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a16eb80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a1a4cd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a19aee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0a60d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a09d310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0a6cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0a6670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0a7100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0e5910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ea9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c58640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1297f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a166460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0d90d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ea1f0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0ecbb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a1b5070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0dc2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c0b400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c6a9a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79c6adf0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c68490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79ae3040> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d33a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d35e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7a0d86d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79c57730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e7a0d85e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79c1ac70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79a329a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d34f0> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e799d39d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79952f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e7994a3a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e79998100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e798de670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798deac0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_hjxriqbx/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3e79933c70> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798c5190> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3e798c5340> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "08", "epoch": "1726882748", "epoch_int": "1726882748", "date": "2024-09-20", "time": "21:39:08", "iso8601_micro": "2024-09-21T01:39:08.969948Z", "iso8601": "2024-09-21T01:39:08Z", "iso8601_basic": "20240920T213908969948", "iso8601_basic_short": "20240920T213908", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": 
"systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] 
removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing 
getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # 
destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # 
destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
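The block above is the raw stdout of the remote setup module: a single JSON object carrying ansible_facts and the invocation arguments, surrounded by the interpreter's verbose import and cleanup chatter (PYTHONVERBOSE=1 is visible in ansible_env), followed by the SSH stderr from the multiplexed connection. That extra chatter is what triggers the "junk after the JSON data" warning that follows. As a minimal sketch of how such mixed output can be separated, and not Ansible's own parsing code, the snippet below pulls the first JSON value out of a string and treats whatever trails it as junk; the split_json_from_junk helper and the shortened sample string are made up for illustration.

    import json

    def split_json_from_junk(raw: str):
        """Split mixed module stdout into (parsed JSON, trailing junk).

        Illustrative only: assumes the first '{' in the text starts the
        module's JSON result, which holds for the output shown above but
        is not guaranteed in general.
        """
        start = raw.find("{")
        if start == -1:
            raise ValueError("no JSON object found in module output")
        obj, end = json.JSONDecoder().raw_decode(raw[start:])  # parse exactly one JSON value
        junk = raw[start + end:].strip()                       # whatever follows it
        return obj, junk

    # Shape similar to the output above, heavily shortened:
    raw = '{"ansible_facts": {"ansible_system": "Linux"}, "invocation": {}} # clear builtins._ # clear sys.path'
    facts, junk = split_json_from_junk(raw)
    print(facts["ansible_facts"]["ansible_system"])  # -> Linux
    print(junk)                                      # -> # clear builtins._ # clear sys.path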
[WARNING]: Module invocation had junk after the JSON data:
28173 1726882749.02095: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882749.02099: _low_level_execute_command(): starting 28173 1726882749.02101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882748.4787-28260-48032118752334/ > /dev/null 2>&1 && sleep 0' 28173 1726882749.03120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.03133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.03148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.03172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.03221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.03234: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.03252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.03272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.03284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.03295: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.03307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.03321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.03334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.03344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host
10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.03354: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.03372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.03450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.03470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.03485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.03758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.06241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.06244: stdout chunk (state=3): >>><<< 28173 1726882749.06247: stderr chunk (state=3): >>><<< 28173 1726882749.06571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882749.06575: handler run complete 28173 1726882749.06578: variable 'ansible_facts' from source: unknown 28173 1726882749.06580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.06582: variable 'ansible_facts' from source: unknown 28173 1726882749.06584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.06605: attempt loop complete, returning result 28173 1726882749.06613: _execute() done 28173 1726882749.06620: dumping result to json 28173 1726882749.06636: done dumping result, returning 28173 1726882749.06649: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-926c-8928-000000000106] 28173 1726882749.06660: sending task result for task 0e448fcc-3ce9-926c-8928-000000000106 ok: [managed_node2] 28173 1726882749.06928: no more pending results, returning what we have 28173 1726882749.06930: results queue empty 28173 1726882749.06931: checking for any_errors_fatal 28173 1726882749.06933: done checking for any_errors_fatal 28173 1726882749.06933: checking for max_fail_percentage 28173 1726882749.06935: done checking for max_fail_percentage 28173 1726882749.06935: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.06936: done checking to see if all hosts have failed 
28173 1726882749.06937: getting the remaining hosts for this loop 28173 1726882749.06938: done getting the remaining hosts for this loop 28173 1726882749.06941: getting the next task for host managed_node2 28173 1726882749.06949: done getting next task for host managed_node2 28173 1726882749.06952: ^ task is: TASK: Check if system is ostree 28173 1726882749.06954: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882749.06957: getting variables 28173 1726882749.06959: in VariableManager get_vars() 28173 1726882749.06992: Calling all_inventory to load vars for managed_node2 28173 1726882749.06995: Calling groups_inventory to load vars for managed_node2 28173 1726882749.06999: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.07011: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.07014: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.07018: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.07248: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000106 28173 1726882749.07252: WORKER PROCESS EXITING 28173 1726882749.07271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.07483: done with get_vars() 28173 1726882749.07495: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:39:09 -0400 (0:00:00.685) 0:00:02.242 ****** 28173 1726882749.07781: entering _queue_task() for managed_node2/stat 28173 1726882749.08240: worker is 1 (out of 1 available) 28173 1726882749.08251: exiting _queue_task() for managed_node2/stat 28173 1726882749.08262: done queuing things up, now waiting for results queue to drain 28173 1726882749.08269: waiting for pending results... 
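The task queued above ("Check if system is ostree", taken from el_repo_setup.yml:17) resolves to the stat action module, and the conditional logged during its execution evaluates `not __network_is_ostree is defined`. As a hedged sketch only, the task conventionally looks something like the following; the /run/ostree-booted path and the register name are assumptions (the standard ostree marker file and a hypothetical variable name), since the actual task file contents are not reproduced in this log:

  # Hypothetical reconstruction of the "Check if system is ostree" task.
  # Only the task name, the stat module, and the when-condition are confirmed
  # by the log below; path and register name are assumed conventions.
  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted        # assumed: usual marker file for ostree-based systems
    register: __ostree_booted_stat    # hypothetical register name
    when: not __network_is_ostree is defined

In the usual pattern, a follow-up task would set __network_is_ostree from the registered stat result so the check is skipped on later runs; that follow-up is not shown in this portion of the log.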
28173 1726882749.08599: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 28173 1726882749.08723: in run() - task 0e448fcc-3ce9-926c-8928-000000000108 28173 1726882749.08744: variable 'ansible_search_path' from source: unknown 28173 1726882749.08776: variable 'ansible_search_path' from source: unknown 28173 1726882749.08816: calling self._execute() 28173 1726882749.08910: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.08920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.08934: variable 'omit' from source: magic vars 28173 1726882749.10810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882749.11787: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882749.11836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882749.11993: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882749.12033: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882749.12234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882749.12263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882749.12305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882749.12422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882749.12662: Evaluated conditional (not __network_is_ostree is defined): True 28173 1726882749.12680: variable 'omit' from source: magic vars 28173 1726882749.12834: variable 'omit' from source: magic vars 28173 1726882749.12885: variable 'omit' from source: magic vars 28173 1726882749.12916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882749.12991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882749.13015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882749.13070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882749.13172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882749.13207: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882749.13215: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.13223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.13444: Set connection var ansible_pipelining to False 28173 1726882749.13452: Set connection var ansible_shell_type to sh 28173 1726882749.13469: Set connection 
var ansible_module_compression to ZIP_DEFLATED 28173 1726882749.13578: Set connection var ansible_timeout to 10 28173 1726882749.13597: Set connection var ansible_shell_executable to /bin/sh 28173 1726882749.13607: Set connection var ansible_connection to ssh 28173 1726882749.13633: variable 'ansible_shell_executable' from source: unknown 28173 1726882749.13641: variable 'ansible_connection' from source: unknown 28173 1726882749.13649: variable 'ansible_module_compression' from source: unknown 28173 1726882749.13655: variable 'ansible_shell_type' from source: unknown 28173 1726882749.13662: variable 'ansible_shell_executable' from source: unknown 28173 1726882749.13676: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.13684: variable 'ansible_pipelining' from source: unknown 28173 1726882749.13690: variable 'ansible_timeout' from source: unknown 28173 1726882749.13705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.14012: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882749.14082: variable 'omit' from source: magic vars 28173 1726882749.14092: starting attempt loop 28173 1726882749.14098: running the handler 28173 1726882749.14113: _low_level_execute_command(): starting 28173 1726882749.14145: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882749.16161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.16182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.16198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.16223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.16272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.16327: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.16343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.16362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.16380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.16391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.16403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.16418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.16442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.16456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.16473: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.16550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.16630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.16779: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28173 1726882749.16796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.16937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.19204: stdout chunk (state=3): >>>/root <<< 28173 1726882749.19472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.19476: stdout chunk (state=3): >>><<< 28173 1726882749.19493: stderr chunk (state=3): >>><<< 28173 1726882749.19615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882749.19626: _low_level_execute_command(): starting 28173 1726882749.19629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770 `" && echo ansible-tmp-1726882749.1951478-28305-250048112074770="` echo /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770 `" ) && sleep 0' 28173 1726882749.20245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.20259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.20284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.20300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.20338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.20349: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.20360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.20389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.20402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.20412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.20423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.20434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.20446: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.20456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.20470: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.20487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.20573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.20599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.20618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.20780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.23490: stdout chunk (state=3): >>>ansible-tmp-1726882749.1951478-28305-250048112074770=/root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770 <<< 28173 1726882749.23649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.23725: stderr chunk (state=3): >>><<< 28173 1726882749.23729: stdout chunk (state=3): >>><<< 28173 1726882749.23872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882749.1951478-28305-250048112074770=/root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882749.23876: variable 'ansible_module_compression' from source: unknown 28173 1726882749.23879: ANSIBALLZ: Using lock for stat 28173 1726882749.23881: ANSIBALLZ: Acquiring lock 28173 1726882749.23882: ANSIBALLZ: Lock acquired: 140243976271488 28173 1726882749.23983: ANSIBALLZ: Creating module 28173 1726882749.38892: ANSIBALLZ: Writing module into payload 28173 1726882749.39016: ANSIBALLZ: Writing module 28173 1726882749.39046: ANSIBALLZ: Renaming module 28173 1726882749.39055: ANSIBALLZ: Done creating module 28173 1726882749.39081: variable 'ansible_facts' from source: unknown 28173 1726882749.39170: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/AnsiballZ_stat.py 28173 1726882749.39324: Sending initial data 28173 1726882749.39327: Sent initial data (153 bytes) 28173 1726882749.40286: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.40301: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 28173 1726882749.40316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.40334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.40384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.40398: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.40412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.40430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.40442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.40459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.40476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.40492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.40509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.40522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.40533: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.40546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.40627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.40644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.40659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.40805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.43445: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882749.43550: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882749.43659: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp1c57208m /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/AnsiballZ_stat.py <<< 28173 1726882749.43757: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882749.45131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.45416: stderr chunk (state=3): >>><<< 28173 1726882749.45420: stdout chunk (state=3): >>><<< 28173 1726882749.45422: done transferring module to remote 28173 1726882749.45424: _low_level_execute_command(): starting 28173 1726882749.45427: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/ /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/AnsiballZ_stat.py && sleep 0' 28173 1726882749.46068: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.46085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.46109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.46127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.46172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.46185: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.46202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.46227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.46239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.46249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.46260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.46277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.46292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.46304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.46320: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.46340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.46423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.46451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.46470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.46605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.49318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.49457: stderr chunk (state=3): >>><<< 28173 1726882749.49518: stdout chunk (state=3): >>><<< 28173 1726882749.49549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882749.49561: _low_level_execute_command(): starting 28173 1726882749.49571: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/AnsiballZ_stat.py && sleep 0' 28173 1726882749.50351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.50360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.50375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.50389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.50493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.50500: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.50514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.50530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.50544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.50554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.50561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.50581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.50600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.50612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.50626: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.50642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.50735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.50756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.50782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.50934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.53814: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28173 1726882749.53819: stdout chunk (state=3): >>>import _imp # builtin <<< 28173 1726882749.53862: stdout chunk (state=3): >>>import '_thread' # <<< 28173 1726882749.53882: stdout chunk (state=3): >>>import '_warnings' # <<< 28173 1726882749.53888: stdout chunk (state=3): >>>import '_weakref' # <<< 28173 1726882749.53990: stdout chunk (state=3): >>>import '_io' # <<< 28173 1726882749.54017: stdout chunk (state=3): >>>import 'marshal' # <<< 28173 1726882749.54058: stdout chunk (state=3): >>>import 'posix' # <<< 28173 1726882749.54118: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 28173 1726882749.54131: stdout chunk (state=3): >>># installing zipimport hook <<< 28173 1726882749.54191: 
stdout chunk (state=3): >>>import 'time' # <<< 28173 1726882749.54223: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 28173 1726882749.54359: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 28173 1726882749.54567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3b20> <<< 28173 1726882749.54618: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 28173 1726882749.54622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 28173 1726882749.54672: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3ac0> <<< 28173 1726882749.54686: stdout chunk (state=3): >>>import '_signal' # <<< 28173 1726882749.54742: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc'<<< 28173 1726882749.54745: stdout chunk (state=3): >>> <<< 28173 1726882749.54791: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598490> <<< 28173 1726882749.54810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 28173 1726882749.54853: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc'<<< 28173 1726882749.54868: stdout chunk (state=3): >>> <<< 28173 1726882749.54900: stdout chunk (state=3): >>>import '_abc' # <<< 28173 1726882749.54910: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598940> <<< 28173 1726882749.54935: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598670> <<< 28173 1726882749.54998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 28173 1726882749.55016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 28173 1726882749.55040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 28173 1726882749.55091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 28173 
1726882749.55129: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 28173 1726882749.55157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 28173 1726882749.55224: stdout chunk (state=3): >>>import '_stat' # <<< 28173 1726882749.55227: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f190> <<< 28173 1726882749.55257: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 28173 1726882749.55304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc'<<< 28173 1726882749.55308: stdout chunk (state=3): >>> <<< 28173 1726882749.55418: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f220> <<< 28173 1726882749.55474: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 28173 1726882749.55536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 28173 1726882749.55562: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838572850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f940> <<< 28173 1726882749.55962: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838548d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838572d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28173 1726882749.56316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 28173 1726882749.56331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 28173 1726882749.56384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 28173 1726882749.56397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 28173 1726882749.56428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 28173 1726882749.56461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 28173 1726882749.56532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 28173 1726882749.56536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 28173 1726882749.56553: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838513eb0> <<< 28173 1726882749.56630: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838516f40> <<< 28173 1726882749.56677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 28173 1726882749.56714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<< 28173 1726882749.56741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 28173 1726882749.56771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 28173 1726882749.56828: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 28173 1726882749.56845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28173 1726882749.56858: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883850c610> <<< 28173 1726882749.56899: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838512640> <<< 28173 1726882749.56932: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838513370> <<< 28173 1726882749.56948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 28173 1726882749.57059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 28173 1726882749.57094: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 28173 1726882749.57153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882749.57205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 28173 1726882749.57208: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 28173 1726882749.57295: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.57299: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838494df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384948e0> <<< 28173 1726882749.57325: stdout chunk (state=3): >>>import 'itertools' # <<< 28173 1726882749.57399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 28173 1726882749.57417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494ee0> <<< 28173 1726882749.57431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 28173 1726882749.57469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 28173 1726882749.57520: stdout chunk (state=3): >>>import '_operator' # <<< 28173 1726882749.57525: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494fa0> <<< 28173 1726882749.57583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc'<<< 28173 1726882749.57608: stdout chunk (state=3): >>> <<< 28173 1726882749.57621: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494eb0> import '_collections' # <<< 28173 1726882749.57704: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384eed60> <<< 28173 1726882749.57719: stdout chunk (state=3): >>>import '_functools' # <<< 28173 1726882749.57750: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384e7640> <<< 28173 1726882749.57873: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384fa6a0><<< 28173 1726882749.57906: stdout chunk (state=3): >>> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883851adf0> <<< 28173 1726882749.57933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 28173 1726882749.58032: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.58038: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88384a7ca0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f88384ee280> <<< 28173 1726882749.58101: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.58130: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88384fa2b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385209a0> <<< 28173 1726882749.58178: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 28173 1726882749.58190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 28173 1726882749.58239: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 28173 1726882749.58242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882749.58275: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 28173 1726882749.58318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 28173 1726882749.58347: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7dc0> <<< 28173 1726882749.58417: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc'<<< 28173 1726882749.58429: stdout chunk (state=3): >>> import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7d30> <<< 28173 1726882749.58473: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 28173 1726882749.58486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 28173 1726882749.58515: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 28173 1726882749.58546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 28173 1726882749.58584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 28173 1726882749.58700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28173 1726882749.58721: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883847a3a0> <<< 28173 1726882749.58747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 
28173 1726882749.58786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28173 1726882749.58827: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883847a490> <<< 28173 1726882749.59018: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384aefd0> <<< 28173 1726882749.59081: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9a60> <<< 28173 1726882749.59094: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9580> <<< 28173 1726882749.59145: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 28173 1726882749.59157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28173 1726882749.59204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 28173 1726882749.59255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28173 1726882749.59299: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 28173 1726882749.59312: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381881f0> <<< 28173 1726882749.59372: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838465b80> <<< 28173 1726882749.59494: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9ee0> <<< 28173 1726882749.59498: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883851afd0> <<< 28173 1726882749.59510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 28173 1726882749.59554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 28173 1726882749.59633: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883819ab20> <<< 28173 1726882749.59651: stdout chunk (state=3): >>>import 'errno' # <<< 28173 1726882749.59736: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883819ae50><<< 28173 1726882749.59776: stdout chunk (state=3): >>> <<< 28173 1726882749.59794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 28173 1726882749.59840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc 
matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 28173 1726882749.59881: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381ac760> <<< 28173 1726882749.59894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 28173 1726882749.59937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 28173 1726882749.59995: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381acca0> <<< 28173 1726882749.60081: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.60085: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381393d0> <<< 28173 1726882749.60115: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883819af40> <<< 28173 1726882749.60143: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 28173 1726882749.60267: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883814a2b0> <<< 28173 1726882749.60271: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381ac5e0> <<< 28173 1726882749.60300: stdout chunk (state=3): >>>import 'pwd' # <<< 28173 1726882749.60352: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883814a370><<< 28173 1726882749.60355: stdout chunk (state=3): >>> <<< 28173 1726882749.60407: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7a00> <<< 28173 1726882749.60469: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 28173 1726882749.60483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 28173 1726882749.60514: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 28173 1726882749.60546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 28173 1726882749.60629: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.60672: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381656d0> <<< 28173 1726882749.60701: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 28173 1726882749.60759: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381659a0> <<< 28173 1726882749.60763: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165790> <<< 28173 1726882749.60830: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838165880> <<< 28173 1726882749.60879: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 28173 1726882749.60895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 28173 1726882749.61184: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838165cd0> <<< 28173 1726882749.61258: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.61296: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838172220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165910> <<< 28173 1726882749.61310: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838159a60> <<< 28173 1726882749.61351: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a75e0> <<< 28173 1726882749.61391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 28173 1726882749.61486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 28173 1726882749.61542: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165ac0> <<< 28173 1726882749.61703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 28173 1726882749.61738: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f88380876a0> <<< 28173 1726882749.61978: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip' <<< 28173 
1726882749.61981: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.62127: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.62215: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/__init__.py <<< 28173 1726882749.62219: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.62259: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 28173 1726882749.62284: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.64330: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.64333: stdout chunk (state=3): >>> <<< 28173 1726882749.65973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 28173 1726882749.66002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad7f0> <<< 28173 1726882749.66061: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882749.66115: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 28173 1726882749.66119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 28173 1726882749.66175: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 28173 1726882749.66236: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.66250: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837fad160> <<< 28173 1726882749.66317: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad280> <<< 28173 1726882749.66606: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fadf40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fadd60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.66626: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837fadfa0> <<< 28173 1726882749.66923: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 28173 1726882749.66937: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 28173 1726882749.67057: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379acf10> <<< 28173 1726882749.67140: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88379ccf10><<< 28173 1726882749.67144: stdout chunk (state=3): >>> <<< 28173 1726882749.67195: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.67213: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88379ccd30> <<< 28173 1726882749.67248: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py<<< 28173 1726882749.67251: stdout chunk (state=3): >>> <<< 28173 1726882749.67300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 28173 1726882749.67363: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379cc3a0> <<< 28173 1726882749.67400: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838014dc0> <<< 28173 1726882749.67696: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88380143a0> <<< 28173 1726882749.67741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28173 1726882749.67782: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838014fa0> <<< 28173 1726882749.67832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 28173 1726882749.67845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 28173 1726882749.67900: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 28173 1726882749.67919: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 28173 1726882749.67946: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 28173 1726882749.67960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 28173 1726882749.68003: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 28173 1726882749.68038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fe4c70><<< 28173 1726882749.68041: stdout chunk (state=3): >>> <<< 28173 1726882749.68182: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7fd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7f3d0> <<< 28173 1726882749.68196: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fb54c0> <<< 28173 1726882749.68247: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.68262: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f7f4f0> <<< 28173 1726882749.68325: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py<<< 28173 1726882749.68342: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7f520> <<< 28173 1726882749.68382: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 28173 1726882749.68414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 28173 1726882749.68449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28173 1726882749.68511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 28173 1726882749.68661: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883798e310> <<< 28173 1726882749.68686: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837ff5220> <<< 28173 1726882749.68711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 28173 1726882749.68725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 28173 1726882749.68821: 
stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.68836: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883799a880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837ff53a0> <<< 28173 1726882749.68869: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 28173 1726882749.68930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 28173 1726882749.68972: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 28173 1726882749.69004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 28173 1726882749.69098: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883800ddc0> <<< 28173 1726882749.69324: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883799a820><<< 28173 1726882749.69327: stdout chunk (state=3): >>> <<< 28173 1726882749.69496: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.69500: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883799a670> <<< 28173 1726882749.69552: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.69569: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837999610> <<< 28173 1726882749.69668: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837999520> <<< 28173 1726882749.69682: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fec8e0> <<< 28173 1726882749.69733: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc'<<< 28173 1726882749.69736: stdout chunk (state=3): >>> <<< 28173 1726882749.69768: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 28173 1726882749.69806: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 28173 1726882749.69895: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.69898: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f766a0> <<< 28173 1726882749.70237: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f74af0><<< 28173 1726882749.70258: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f840a0> <<< 28173 1726882749.70327: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.70355: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f76100> <<< 28173 1726882749.70386: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fb9ac0> <<< 28173 1726882749.70413: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.70427: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.70441: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 28173 1726882749.70454: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.70592: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.70760: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28173 1726882749.70828: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 28173 1726882749.70832: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 28173 1726882749.70849: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.70998: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.71154: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.71157: stdout chunk (state=3): >>> <<< 28173 1726882749.71975: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.72742: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py<<< 28173 1726882749.72756: stdout chunk (state=3): >>> <<< 28173 1726882749.72775: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 28173 1726882749.72796: stdout chunk 
(state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 28173 1726882749.72799: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py<<< 28173 1726882749.72804: stdout chunk (state=3): >>> <<< 28173 1726882749.72835: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 28173 1726882749.72840: stdout chunk (state=3): >>> <<< 28173 1726882749.72860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 28173 1726882749.72869: stdout chunk (state=3): >>> <<< 28173 1726882749.72946: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.72952: stdout chunk (state=3): >>> <<< 28173 1726882749.72966: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883759a5b0> <<< 28173 1726882749.73077: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py<<< 28173 1726882749.73080: stdout chunk (state=3): >>> <<< 28173 1726882749.73083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 28173 1726882749.73108: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883796b550> <<< 28173 1726882749.73126: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883753a0d0> <<< 28173 1726882749.73204: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py<<< 28173 1726882749.73214: stdout chunk (state=3): >>> <<< 28173 1726882749.73229: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.73234: stdout chunk (state=3): >>> <<< 28173 1726882749.73273: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.73330: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 28173 1726882749.73333: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.73525: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.73754: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 28173 1726882749.73756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 28173 1726882749.73792: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f74be0> <<< 28173 1726882749.73819: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.74488: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.75115: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75121: stdout chunk (state=3): >>> <<< 28173 1726882749.75214: stdout chunk (state=3): 
>>># zipimport: zlib available<<< 28173 1726882749.75219: stdout chunk (state=3): >>> <<< 28173 1726882749.75315: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/collections.py<<< 28173 1726882749.75324: stdout chunk (state=3): >>> <<< 28173 1726882749.75341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.75396: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75401: stdout chunk (state=3): >>> <<< 28173 1726882749.75446: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py<<< 28173 1726882749.75455: stdout chunk (state=3): >>> <<< 28173 1726882749.75472: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75477: stdout chunk (state=3): >>> <<< 28173 1726882749.75569: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75574: stdout chunk (state=3): >>> <<< 28173 1726882749.75684: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/errors.py<<< 28173 1726882749.75689: stdout chunk (state=3): >>> <<< 28173 1726882749.75712: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75730: stdout chunk (state=3): >>> <<< 28173 1726882749.75732: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75742: stdout chunk (state=3): >>> <<< 28173 1726882749.75753: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 28173 1726882749.75783: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75788: stdout chunk (state=3): >>> <<< 28173 1726882749.75842: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75846: stdout chunk (state=3): >>> <<< 28173 1726882749.75895: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py<<< 28173 1726882749.75903: stdout chunk (state=3): >>> <<< 28173 1726882749.75921: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.75928: stdout chunk (state=3): >>> <<< 28173 1726882749.76267: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.76589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28173 1726882749.76637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 28173 1726882749.76668: stdout chunk (state=3): >>>import '_ast' # <<< 28173 1726882749.76830: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379779a0> <<< 28173 1726882749.76845: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.76975: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.76980: stdout chunk (state=3): >>> <<< 28173 1726882749.77096: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 28173 1726882749.77117: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 28173 1726882749.77146: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28173 1726882749.77184: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77256: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77325: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 28173 1726882749.77359: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77456: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77519: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77647: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.77748: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py<<< 28173 1726882749.77767: stdout chunk (state=3): >>> <<< 28173 1726882749.77802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 28173 1726882749.77805: stdout chunk (state=3): >>> <<< 28173 1726882749.77936: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'<<< 28173 1726882749.77955: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 28173 1726882749.77960: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838000250> <<< 28173 1726882749.78027: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837977f10><<< 28173 1726882749.78030: stdout chunk (state=3): >>> <<< 28173 1726882749.78096: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/file.py<<< 28173 1726882749.78099: stdout chunk (state=3): >>> import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/process.py<<< 28173 1726882749.78109: stdout chunk (state=3): >>> <<< 28173 1726882749.78122: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.78320: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.78404: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.78409: stdout chunk (state=3): >>> <<< 28173 1726882749.78443: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.78448: stdout chunk (state=3): >>> <<< 
28173 1726882749.78502: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py<<< 28173 1726882749.78507: stdout chunk (state=3): >>> <<< 28173 1726882749.78536: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 28173 1726882749.78571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py<<< 28173 1726882749.78575: stdout chunk (state=3): >>> <<< 28173 1726882749.78628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'<<< 28173 1726882749.78638: stdout chunk (state=3): >>> <<< 28173 1726882749.78666: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 28173 1726882749.78697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'<<< 28173 1726882749.78701: stdout chunk (state=3): >>> <<< 28173 1726882749.78844: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883795d7f0><<< 28173 1726882749.78851: stdout chunk (state=3): >>> <<< 28173 1726882749.78911: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837958820><<< 28173 1726882749.78916: stdout chunk (state=3): >>> <<< 28173 1726882749.79025: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837952a00><<< 28173 1726882749.79028: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro<<< 28173 1726882749.79029: stdout chunk (state=3): >>> <<< 28173 1726882749.79031: stdout chunk (state=3): >>>import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 28173 1726882749.79052: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.79056: stdout chunk (state=3): >>> <<< 28173 1726882749.79096: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.79100: stdout chunk (state=3): >>> <<< 28173 1726882749.79144: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py<<< 28173 1726882749.79147: stdout chunk (state=3): >>> import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py<<< 28173 1726882749.79150: stdout chunk (state=3): >>> <<< 28173 1726882749.79255: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/basic.py<<< 28173 1726882749.79260: stdout chunk (state=3): >>> <<< 28173 1726882749.79285: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.79305: stdout chunk (state=3): >>># zipimport: zlib available<<< 28173 1726882749.79355: stdout chunk (state=3): >>> import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 28173 1726882749.79357: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.79542: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 
1726882749.79843: stdout chunk (state=3): >>># zipimport: zlib available <<< 28173 1726882749.80058: stdout chunk (state=3): >>> <<< 28173 1726882749.80087: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 28173 1726882749.80453: stdout chunk (state=3): >>># clear builtins._ <<< 28173 1726882749.80495: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1<<< 28173 1726882749.80502: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_type # clear sys.last_value<<< 28173 1726882749.80541: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path <<< 28173 1726882749.80545: stdout chunk (state=3): >>># clear sys.__interactivehook__ <<< 28173 1726882749.80557: stdout chunk (state=3): >>># restore sys.stdin <<< 28173 1726882749.80572: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr <<< 28173 1726882749.80600: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins<<< 28173 1726882749.80617: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 28173 1726882749.80636: stdout chunk (state=3): >>> # cleanup[2] removing _warnings <<< 28173 1726882749.80655: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io <<< 28173 1726882749.80680: stdout chunk (state=3): >>># cleanup[2] removing marshal <<< 28173 1726882749.80709: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 28173 1726882749.80734: stdout chunk (state=3): >>># cleanup[2] removing time <<< 28173 1726882749.80858: stdout chunk (state=3): >>># cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs<<< 28173 1726882749.80874: stdout chunk (state=3): >>> # cleanup[2] removing encodings.aliases <<< 28173 1726882749.80895: stdout chunk (state=3): >>># cleanup[2] removing encodings<<< 28173 1726882749.80920: stdout chunk (state=3): >>> # cleanup[2] removing encodings.utf_8<<< 28173 1726882749.80935: stdout chunk (state=3): >>> # cleanup[2] removing _signal <<< 28173 1726882749.80973: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 <<< 28173 1726882749.81002: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc<<< 28173 1726882749.81016: stdout chunk (state=3): >>> # cleanup[2] removing io<<< 28173 1726882749.81032: stdout chunk (state=3): >>> # cleanup[2] removing __main__ # cleanup[2] removing _stat<<< 28173 1726882749.81035: stdout chunk (state=3): >>> # cleanup[2] removing stat<<< 28173 1726882749.81048: stdout chunk (state=3): >>> <<< 28173 1726882749.81050: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc <<< 28173 1726882749.81086: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath <<< 28173 1726882749.81126: stdout chunk (state=3): >>># cleanup[2] removing os.path <<< 28173 1726882749.81217: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins<<< 28173 1726882749.81222: stdout chunk (state=3): >>> # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] 
removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq <<< 28173 1726882749.81225: stdout chunk (state=3): >>># cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword<<< 28173 1726882749.81230: stdout chunk (state=3): >>> # destroy keyword<<< 28173 1726882749.81234: stdout chunk (state=3): >>> # cleanup[2] removing _operator<<< 28173 1726882749.81236: stdout chunk (state=3): >>> # cleanup[2] removing operator<<< 28173 1726882749.81240: stdout chunk (state=3): >>> # cleanup[2] removing reprlib<<< 28173 1726882749.81243: stdout chunk (state=3): >>> # destroy reprlib<<< 28173 1726882749.81245: stdout chunk (state=3): >>> # cleanup[2] removing _collections<<< 28173 1726882749.81248: stdout chunk (state=3): >>> # cleanup[2] removing collections <<< 28173 1726882749.81252: stdout chunk (state=3): >>># cleanup[2] removing _functools<<< 28173 1726882749.81255: stdout chunk (state=3): >>> # cleanup[2] removing functools<<< 28173 1726882749.81257: stdout chunk (state=3): >>> # cleanup[2] removing copyreg<<< 28173 1726882749.81323: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64<<< 28173 1726882749.81328: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap<<< 28173 1726882749.81331: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external<<< 28173 1726882749.81333: stdout chunk (state=3): >>> # cleanup[2] removing warnings <<< 28173 1726882749.81361: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 28173 1726882749.81367: stdout chunk (state=3): >>># cleanup[2] removing collections.abc <<< 28173 1726882749.81394: stdout chunk (state=3): >>># cleanup[2] removing contextlib <<< 28173 1726882749.81421: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc <<< 28173 1726882749.81430: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset<<< 28173 1726882749.81460: stdout chunk (state=3): >>> # destroy _weakrefset # cleanup[2] removing weakref <<< 28173 1726882749.81467: stdout chunk (state=3): >>># cleanup[2] removing pkgutil <<< 28173 1726882749.81492: stdout chunk (state=3): >>># destroy pkgutil <<< 28173 1726882749.81545: stdout chunk (state=3): >>># cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 28173 1726882749.81549: stdout chunk (state=3): >>> # cleanup[2] removing zlib <<< 28173 1726882749.81553: stdout chunk (state=3): >>># cleanup[2] removing _compression # cleanup[2] removing threading<<< 28173 1726882749.81576: stdout chunk (state=3): >>> # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2<<< 28173 1726882749.81750: stdout chunk (state=3): >>> # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd<<< 28173 1726882749.81756: stdout chunk (state=3): >>> # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 28173 1726882749.81759: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect<<< 28173 1726882749.81761: stdout chunk (state=3): >>> # cleanup[2] removing _random # cleanup[2] removing 
_hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile <<< 28173 1726882749.81771: stdout chunk (state=3): >>># cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 28173 1726882749.81774: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 28173 1726882749.81777: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 28173 1726882749.81790: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat<<< 28173 1726882749.81817: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux<<< 28173 1726882749.81830: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings<<< 28173 1726882749.81838: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast <<< 28173 1726882749.81866: 
stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 28173 1726882749.81879: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 28173 1726882749.81893: stdout chunk (state=3): >>># cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 28173 1726882749.82157: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28173 1726882749.82196: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 28173 1726882749.82235: stdout chunk (state=3): >>># destroy zipimport <<< 28173 1726882749.82313: stdout chunk (state=3): >>># destroy _compression <<< 28173 1726882749.82562: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array <<< 28173 1726882749.82570: stdout chunk (state=3): >>># destroy datetime <<< 28173 1726882749.82573: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 28173 1726882749.82638: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon <<< 28173 1726882749.82703: stdout chunk (state=3): >>># cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 28173 1726882749.82723: stdout chunk (state=3): >>># destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 28173 1726882749.82759: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # 
cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 28173 1726882749.82806: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 28173 1726882749.82848: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 28173 1726882749.82861: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 28173 1726882749.82881: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 28173 1726882749.82896: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse<<< 28173 1726882749.82900: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 28173 1726882749.82903: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 28173 1726882749.82906: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal <<< 28173 1726882749.82909: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 28173 1726882749.82911: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib <<< 28173 1726882749.82913: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader<<< 28173 1726882749.82916: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 28173 1726882749.83098: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 28173 1726882749.83124: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 28173 1726882749.83128: stdout chunk (state=3): >>># destroy posixpath <<< 28173 1726882749.83130: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib <<< 28173 1726882749.83167: stdout chunk (state=3): >>># destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 28173 1726882749.83171: stdout chunk (state=3): >>># destroy 
select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 28173 1726882749.83174: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 28173 1726882749.83217: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks<<< 28173 1726882749.83226: stdout chunk (state=3): >>> <<< 28173 1726882749.83675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882749.83678: stdout chunk (state=3): >>><<< 28173 1726882749.83688: stderr chunk (state=3): >>><<< 28173 1726882749.83743: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838572850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883854f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838548d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838572d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838598970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838513eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838516f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883850c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838512640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838513370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838494df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384948e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494ee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494fa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838494eb0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384eed60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384e7640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384fa6a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883851adf0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88384a7ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384ee280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88384fa2b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88385209a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883847a3a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883847a490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384aefd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381881f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838465b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a9ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883851afd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883819ab20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883819ae50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381ac760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381acca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381393d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883819af40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883814a2b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88381ac5e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883814a370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a7a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381656d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88381659a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838165880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838165cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838172220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838159a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88384a75e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838165ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f88380876a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad7f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837fad160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fadf40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fadd60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837fadfa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fad100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379acf10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88379ccf10> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88379ccd30> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379cc3a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838014dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88380143a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8838014fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fe4c70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7fd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7f3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fb54c0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f7f4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f7f520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883798e310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837ff5220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883799a880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837ff53a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883800ddc0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883799a820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883799a670> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837999610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837999520> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fec8e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f766a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f74af0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f840a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8837f76100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837fb9ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883759a5b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883796b550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883753a0d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837f74be0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88379779a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8838000250> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837977f10> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883795d7f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837958820> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8837952a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_0ppce1jx/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
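Editor's note (illustrative, not Ansible's actual parser): a module reports its result as a single JSON document on stdout, but with interpreter import tracing enabled the shutdown-time cleanup/destroy messages land before and after that JSON, which is what triggers the "junk after the JSON data" warning that follows. A hedged sketch of the general recovery technique, scanning for the first brace and decoding exactly one JSON document with json.JSONDecoder.raw_decode (the sample string below is a stand-in for the much longer dump above):

import json

# Simplified stand-in for a module's noisy stdout: trace text, the JSON result,
# then shutdown "junk" of the kind quoted in the warning below.
raw_stdout = ("import 'codecs' # trace noise\n"
              '{"changed": false, "stat": {"exists": false}}\n'
              "# destroy __main__ # clear sys.audit hooks\n")

start = raw_stdout.index("{")                       # first candidate JSON start
result, end = json.JSONDecoder().raw_decode(raw_stdout[start:])
junk_after = raw_stdout[start + end:].strip()       # what the warning calls junk

print(result["stat"]["exists"])                     # False
print(junk_after)                                   # "# destroy __main__ ..."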
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 28173 1726882749.84198: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882749.84207: _low_level_execute_command(): starting 28173 1726882749.84211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882749.1951478-28305-250048112074770/ > /dev/null 2>&1 && sleep 0' 28173 1726882749.84572: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882749.84588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.84604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.84622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.84683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.84696: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882749.84709: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.84725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882749.84741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882749.84758: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882749.84779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882749.84792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882749.84806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882749.84818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882749.84829: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882749.84841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882749.84930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882749.84951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882749.84986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882749.85119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882749.87681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882749.87721: stderr chunk (state=3): >>><<< 28173 1726882749.87725: stdout chunk (state=3): >>><<< 28173 1726882749.87736: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882749.87742: handler run complete 28173 1726882749.87761: attempt loop complete, returning result 28173 1726882749.87769: _execute() done 28173 1726882749.87772: dumping result to json 28173 1726882749.87774: done dumping result, returning 28173 1726882749.87776: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-926c-8928-000000000108] 28173 1726882749.87782: sending task result for task 0e448fcc-3ce9-926c-8928-000000000108 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28173 1726882749.87928: no more pending results, returning what we have 28173 1726882749.87930: results queue empty 
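The output above is the tail end of the "Check if system is ostree" task: the AnsiballZ payload for the stat module finished and tore down its imports, the controller removed the remote temp directory with `rm -f -r ... && sleep 0` over the multiplexed SSH connection, and the task returned ok for managed_node2 with stat.exists == false. A minimal sketch of the kind of task that produces this trace (el_repo_setup.yml itself is not reproduced in this log, so the exact wording is reconstructed from the module arguments and variable names seen here):

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # path taken from the _execute_module(stat, {...}) call above
      register: __ostree_booted_stat    # name inferred from the variable lookup in the set_fact task below

stat.exists comes back false here because the managed node is a package-based install rather than an ostree-based one, which is what the next task records as a fact.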
28173 1726882749.87931: checking for any_errors_fatal 28173 1726882749.87938: done checking for any_errors_fatal 28173 1726882749.87938: checking for max_fail_percentage 28173 1726882749.87940: done checking for max_fail_percentage 28173 1726882749.87940: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.87941: done checking to see if all hosts have failed 28173 1726882749.87942: getting the remaining hosts for this loop 28173 1726882749.87943: done getting the remaining hosts for this loop 28173 1726882749.87946: getting the next task for host managed_node2 28173 1726882749.87951: done getting next task for host managed_node2 28173 1726882749.87954: ^ task is: TASK: Set flag to indicate system is ostree 28173 1726882749.87956: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882749.87960: getting variables 28173 1726882749.87962: in VariableManager get_vars() 28173 1726882749.87995: Calling all_inventory to load vars for managed_node2 28173 1726882749.87998: Calling groups_inventory to load vars for managed_node2 28173 1726882749.88001: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.88011: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.88013: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.88016: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.88157: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000108 28173 1726882749.88162: WORKER PROCESS EXITING 28173 1726882749.88175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.88301: done with get_vars() 28173 1726882749.88309: done getting variables 28173 1726882749.88382: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:39:09 -0400 (0:00:00.806) 0:00:03.048 ****** 28173 1726882749.88406: entering _queue_task() for managed_node2/set_fact 28173 1726882749.88407: Creating lock for set_fact 28173 1726882749.88592: worker is 1 (out of 1 available) 28173 1726882749.88603: exiting _queue_task() for managed_node2/set_fact 28173 1726882749.88614: done queuing things up, now waiting for results queue to drain 28173 1726882749.88615: waiting for pending results... 
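The banner just above queues "Set flag to indicate system is ostree" (el_repo_setup.yml:22). The trace that follows evaluates the conditional `not __network_is_ostree is defined`, reads the registered `__ostree_booted_stat`, and ends with the fact `__network_is_ostree: false`. A plausible reconstruction is shown below; the exact Jinja expression is an assumption, since the log only confirms the conditional, the input variable, and the resulting value:

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # evaluates to false on this host
      when: not __network_is_ostree is defined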
28173 1726882749.88764: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 28173 1726882749.88830: in run() - task 0e448fcc-3ce9-926c-8928-000000000109 28173 1726882749.88840: variable 'ansible_search_path' from source: unknown 28173 1726882749.88843: variable 'ansible_search_path' from source: unknown 28173 1726882749.88883: calling self._execute() 28173 1726882749.88932: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.88936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.88945: variable 'omit' from source: magic vars 28173 1726882749.89283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882749.89511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882749.89546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882749.89573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882749.89598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882749.89660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882749.89681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882749.89698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882749.89717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882749.89805: Evaluated conditional (not __network_is_ostree is defined): True 28173 1726882749.89809: variable 'omit' from source: magic vars 28173 1726882749.89834: variable 'omit' from source: magic vars 28173 1726882749.89916: variable '__ostree_booted_stat' from source: set_fact 28173 1726882749.89952: variable 'omit' from source: magic vars 28173 1726882749.89976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882749.89994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882749.90008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882749.90020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882749.90029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882749.90051: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882749.90054: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.90056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.90123: Set connection var ansible_pipelining to False 28173 
1726882749.90126: Set connection var ansible_shell_type to sh 28173 1726882749.90132: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882749.90139: Set connection var ansible_timeout to 10 28173 1726882749.90143: Set connection var ansible_shell_executable to /bin/sh 28173 1726882749.90149: Set connection var ansible_connection to ssh 28173 1726882749.90165: variable 'ansible_shell_executable' from source: unknown 28173 1726882749.90170: variable 'ansible_connection' from source: unknown 28173 1726882749.90172: variable 'ansible_module_compression' from source: unknown 28173 1726882749.90176: variable 'ansible_shell_type' from source: unknown 28173 1726882749.90178: variable 'ansible_shell_executable' from source: unknown 28173 1726882749.90180: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.90186: variable 'ansible_pipelining' from source: unknown 28173 1726882749.90189: variable 'ansible_timeout' from source: unknown 28173 1726882749.90191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.90256: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882749.90266: variable 'omit' from source: magic vars 28173 1726882749.90273: starting attempt loop 28173 1726882749.90277: running the handler 28173 1726882749.90286: handler run complete 28173 1726882749.90294: attempt loop complete, returning result 28173 1726882749.90296: _execute() done 28173 1726882749.90301: dumping result to json 28173 1726882749.90303: done dumping result, returning 28173 1726882749.90311: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-926c-8928-000000000109] 28173 1726882749.90313: sending task result for task 0e448fcc-3ce9-926c-8928-000000000109 28173 1726882749.90384: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000109 28173 1726882749.90387: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 28173 1726882749.90443: no more pending results, returning what we have 28173 1726882749.90446: results queue empty 28173 1726882749.90447: checking for any_errors_fatal 28173 1726882749.90451: done checking for any_errors_fatal 28173 1726882749.90451: checking for max_fail_percentage 28173 1726882749.90453: done checking for max_fail_percentage 28173 1726882749.90454: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.90454: done checking to see if all hosts have failed 28173 1726882749.90455: getting the remaining hosts for this loop 28173 1726882749.90457: done getting the remaining hosts for this loop 28173 1726882749.90459: getting the next task for host managed_node2 28173 1726882749.90468: done getting next task for host managed_node2 28173 1726882749.90471: ^ task is: TASK: Fix CentOS6 Base repo 28173 1726882749.90473: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882749.90476: getting variables 28173 1726882749.90477: in VariableManager get_vars() 28173 1726882749.90500: Calling all_inventory to load vars for managed_node2 28173 1726882749.90502: Calling groups_inventory to load vars for managed_node2 28173 1726882749.90509: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.90517: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.90525: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.90532: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.90675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.90796: done with get_vars() 28173 1726882749.90803: done getting variables 28173 1726882749.90886: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:39:09 -0400 (0:00:00.024) 0:00:03.073 ****** 28173 1726882749.90904: entering _queue_task() for managed_node2/copy 28173 1726882749.91058: worker is 1 (out of 1 available) 28173 1726882749.91074: exiting _queue_task() for managed_node2/copy 28173 1726882749.91085: done queuing things up, now waiting for results queue to drain 28173 1726882749.91086: waiting for pending results... 
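Next in the queue is "Fix CentOS6 Base repo" (el_repo_setup.yml:26), loaded through the copy action plugin. The executor below evaluates `ansible_distribution == 'CentOS'` (true) and `ansible_distribution_major_version == '6'` (false) and skips the task. A hedged sketch of the shape of such a task; the destination path and file body are hypothetical, since the log records only the module, the task name, and the two conditions:

    - name: Fix CentOS6 Base repo
      copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical path, not visible in this log
        content: |
          # replacement repo definition would go here (not captured in the log)
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'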
28173 1726882749.91227: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 28173 1726882749.91295: in run() - task 0e448fcc-3ce9-926c-8928-00000000010b 28173 1726882749.91306: variable 'ansible_search_path' from source: unknown 28173 1726882749.91310: variable 'ansible_search_path' from source: unknown 28173 1726882749.91336: calling self._execute() 28173 1726882749.91391: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.91394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.91402: variable 'omit' from source: magic vars 28173 1726882749.91712: variable 'ansible_distribution' from source: facts 28173 1726882749.91730: Evaluated conditional (ansible_distribution == 'CentOS'): True 28173 1726882749.91814: variable 'ansible_distribution_major_version' from source: facts 28173 1726882749.91817: Evaluated conditional (ansible_distribution_major_version == '6'): False 28173 1726882749.91820: when evaluation is False, skipping this task 28173 1726882749.91822: _execute() done 28173 1726882749.91825: dumping result to json 28173 1726882749.91827: done dumping result, returning 28173 1726882749.91832: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-926c-8928-00000000010b] 28173 1726882749.91841: sending task result for task 0e448fcc-3ce9-926c-8928-00000000010b 28173 1726882749.91922: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000010b 28173 1726882749.91924: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28173 1726882749.92005: no more pending results, returning what we have 28173 1726882749.92007: results queue empty 28173 1726882749.92008: checking for any_errors_fatal 28173 1726882749.92010: done checking for any_errors_fatal 28173 1726882749.92011: checking for max_fail_percentage 28173 1726882749.92011: done checking for max_fail_percentage 28173 1726882749.92012: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.92013: done checking to see if all hosts have failed 28173 1726882749.92013: getting the remaining hosts for this loop 28173 1726882749.92014: done getting the remaining hosts for this loop 28173 1726882749.92016: getting the next task for host managed_node2 28173 1726882749.92019: done getting next task for host managed_node2 28173 1726882749.92021: ^ task is: TASK: Include the task 'enable_epel.yml' 28173 1726882749.92023: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882749.92025: getting variables 28173 1726882749.92026: in VariableManager get_vars() 28173 1726882749.92043: Calling all_inventory to load vars for managed_node2 28173 1726882749.92044: Calling groups_inventory to load vars for managed_node2 28173 1726882749.92046: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.92058: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.92060: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.92062: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.92175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.92296: done with get_vars() 28173 1726882749.92302: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:39:09 -0400 (0:00:00.014) 0:00:03.087 ****** 28173 1726882749.92357: entering _queue_task() for managed_node2/include_tasks 28173 1726882749.92506: worker is 1 (out of 1 available) 28173 1726882749.92517: exiting _queue_task() for managed_node2/include_tasks 28173 1726882749.92526: done queuing things up, now waiting for results queue to drain 28173 1726882749.92528: waiting for pending results... 28173 1726882749.92669: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 28173 1726882749.92727: in run() - task 0e448fcc-3ce9-926c-8928-00000000010c 28173 1726882749.92737: variable 'ansible_search_path' from source: unknown 28173 1726882749.92740: variable 'ansible_search_path' from source: unknown 28173 1726882749.92770: calling self._execute() 28173 1726882749.92818: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.92822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.92828: variable 'omit' from source: magic vars 28173 1726882749.93198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882749.94683: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882749.94842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882749.94870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882749.94897: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882749.94916: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882749.94971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882749.94994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882749.95010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 28173 1726882749.95036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882749.95047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882749.95124: variable '__network_is_ostree' from source: set_fact 28173 1726882749.95136: Evaluated conditional (not __network_is_ostree | d(false)): True 28173 1726882749.95142: _execute() done 28173 1726882749.95144: dumping result to json 28173 1726882749.95147: done dumping result, returning 28173 1726882749.95152: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-926c-8928-00000000010c] 28173 1726882749.95158: sending task result for task 0e448fcc-3ce9-926c-8928-00000000010c 28173 1726882749.95237: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000010c 28173 1726882749.95240: WORKER PROCESS EXITING 28173 1726882749.95288: no more pending results, returning what we have 28173 1726882749.95292: in VariableManager get_vars() 28173 1726882749.95317: Calling all_inventory to load vars for managed_node2 28173 1726882749.95319: Calling groups_inventory to load vars for managed_node2 28173 1726882749.95322: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.95329: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.95332: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.95334: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.95478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.95594: done with get_vars() 28173 1726882749.95599: variable 'ansible_search_path' from source: unknown 28173 1726882749.95600: variable 'ansible_search_path' from source: unknown 28173 1726882749.95622: we have included files to process 28173 1726882749.95623: generating all_blocks data 28173 1726882749.95624: done generating all_blocks data 28173 1726882749.95627: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28173 1726882749.95628: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28173 1726882749.95629: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28173 1726882749.96219: done processing included file 28173 1726882749.96221: iterating over new_blocks loaded from include file 28173 1726882749.96222: in VariableManager get_vars() 28173 1726882749.96232: done with get_vars() 28173 1726882749.96234: filtering new block on tags 28173 1726882749.96269: done filtering new block on tags 28173 1726882749.96273: in VariableManager get_vars() 28173 1726882749.96284: done with get_vars() 28173 1726882749.96286: filtering new block on tags 28173 1726882749.96298: done filtering new block on tags 28173 1726882749.96301: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 28173 1726882749.96306: extending task lists for all hosts with included blocks 
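The include above ("Include the task 'enable_epel.yml'", el_repo_setup.yml:51) only runs because `not __network_is_ostree | d(false)` evaluated to true; the included file is then loaded, split into blocks, filtered on tags, and its tasks are appended to managed_node2's task list. In playbook form this is roughly the following, with the relative path being an assumption based on the absolute path printed above:

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)   # __network_is_ostree is false here, so the include proceeds

Because include_tasks is evaluated on the controller, no module payload is shipped to the host for this step; only the newly added child tasks below touch the remote node.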
28173 1726882749.96419: done extending task lists 28173 1726882749.96421: done processing included files 28173 1726882749.96422: results queue empty 28173 1726882749.96422: checking for any_errors_fatal 28173 1726882749.96425: done checking for any_errors_fatal 28173 1726882749.96425: checking for max_fail_percentage 28173 1726882749.96426: done checking for max_fail_percentage 28173 1726882749.96427: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.96428: done checking to see if all hosts have failed 28173 1726882749.96429: getting the remaining hosts for this loop 28173 1726882749.96430: done getting the remaining hosts for this loop 28173 1726882749.96432: getting the next task for host managed_node2 28173 1726882749.96435: done getting next task for host managed_node2 28173 1726882749.96437: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 28173 1726882749.96440: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882749.96442: getting variables 28173 1726882749.96443: in VariableManager get_vars() 28173 1726882749.96450: Calling all_inventory to load vars for managed_node2 28173 1726882749.96452: Calling groups_inventory to load vars for managed_node2 28173 1726882749.96454: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.96459: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.96470: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.96473: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.96657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.97013: done with get_vars() 28173 1726882749.97020: done getting variables 28173 1726882749.97070: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 28173 1726882749.97223: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:39:09 -0400 (0:00:00.048) 0:00:03.136 ****** 28173 1726882749.97255: entering _queue_task() for managed_node2/command 28173 1726882749.97256: Creating lock for command 28173 1726882749.97417: worker is 1 (out of 1 available) 28173 1726882749.97429: exiting _queue_task() for managed_node2/command 28173 1726882749.97439: done queuing things up, now waiting for results queue to drain 28173 1726882749.97441: waiting for pending results... 
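The first included task is "Create EPEL {{ ansible_distribution_major_version }}" (enable_epel.yml:8), which the banner renders as "Create EPEL 9" from the gathered fact. It is a command task gated on `ansible_distribution in ['RedHat', 'CentOS']` and `ansible_distribution_major_version in ['7', '8']`, so on this EL9 host it is skipped and the command itself never appears in the log. A common shape for such a task, purely as an illustrative guess (the rpm invocation and URL are assumptions, not taken from this run):

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: >-
        rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']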
28173 1726882749.97581: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 28173 1726882749.97646: in run() - task 0e448fcc-3ce9-926c-8928-000000000126 28173 1726882749.97654: variable 'ansible_search_path' from source: unknown 28173 1726882749.97658: variable 'ansible_search_path' from source: unknown 28173 1726882749.97693: calling self._execute() 28173 1726882749.97742: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.97745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.97752: variable 'omit' from source: magic vars 28173 1726882749.97994: variable 'ansible_distribution' from source: facts 28173 1726882749.98003: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28173 1726882749.98087: variable 'ansible_distribution_major_version' from source: facts 28173 1726882749.98090: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28173 1726882749.98095: when evaluation is False, skipping this task 28173 1726882749.98098: _execute() done 28173 1726882749.98100: dumping result to json 28173 1726882749.98102: done dumping result, returning 28173 1726882749.98109: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-926c-8928-000000000126] 28173 1726882749.98113: sending task result for task 0e448fcc-3ce9-926c-8928-000000000126 28173 1726882749.98207: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000126 28173 1726882749.98210: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28173 1726882749.98261: no more pending results, returning what we have 28173 1726882749.98267: results queue empty 28173 1726882749.98268: checking for any_errors_fatal 28173 1726882749.98270: done checking for any_errors_fatal 28173 1726882749.98270: checking for max_fail_percentage 28173 1726882749.98272: done checking for max_fail_percentage 28173 1726882749.98272: checking to see if all hosts have failed and the running result is not ok 28173 1726882749.98273: done checking to see if all hosts have failed 28173 1726882749.98275: getting the remaining hosts for this loop 28173 1726882749.98276: done getting the remaining hosts for this loop 28173 1726882749.98278: getting the next task for host managed_node2 28173 1726882749.98281: done getting next task for host managed_node2 28173 1726882749.98283: ^ task is: TASK: Install yum-utils package 28173 1726882749.98285: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882749.98287: getting variables 28173 1726882749.98288: in VariableManager get_vars() 28173 1726882749.98305: Calling all_inventory to load vars for managed_node2 28173 1726882749.98307: Calling groups_inventory to load vars for managed_node2 28173 1726882749.98309: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882749.98315: Calling all_plugins_play to load vars for managed_node2 28173 1726882749.98317: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882749.98319: Calling groups_plugins_play to load vars for managed_node2 28173 1726882749.98448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882749.98594: done with get_vars() 28173 1726882749.98600: done getting variables 28173 1726882749.98661: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:39:09 -0400 (0:00:00.014) 0:00:03.151 ****** 28173 1726882749.98685: entering _queue_task() for managed_node2/package 28173 1726882749.98686: Creating lock for package 28173 1726882749.98839: worker is 1 (out of 1 available) 28173 1726882749.98849: exiting _queue_task() for managed_node2/package 28173 1726882749.98860: done queuing things up, now waiting for results queue to drain 28173 1726882749.98861: waiting for pending results... 
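"Install yum-utils package" (enable_epel.yml:26) goes through the generic package action plugin and is skipped for the same reason as the previous task. The log confirms the task name, the module, and the two conditions; the state parameter in the sketch below is an assumption:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present   # assumed; only the package name is implied by the task name
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']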
28173 1726882749.99087: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 28173 1726882749.99261: in run() - task 0e448fcc-3ce9-926c-8928-000000000127 28173 1726882749.99284: variable 'ansible_search_path' from source: unknown 28173 1726882749.99294: variable 'ansible_search_path' from source: unknown 28173 1726882749.99332: calling self._execute() 28173 1726882749.99413: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882749.99423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882749.99436: variable 'omit' from source: magic vars 28173 1726882749.99797: variable 'ansible_distribution' from source: facts 28173 1726882749.99815: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28173 1726882749.99947: variable 'ansible_distribution_major_version' from source: facts 28173 1726882749.99956: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28173 1726882749.99962: when evaluation is False, skipping this task 28173 1726882749.99972: _execute() done 28173 1726882749.99980: dumping result to json 28173 1726882749.99986: done dumping result, returning 28173 1726882749.99994: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-926c-8928-000000000127] 28173 1726882750.00002: sending task result for task 0e448fcc-3ce9-926c-8928-000000000127 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28173 1726882750.00127: no more pending results, returning what we have 28173 1726882750.00130: results queue empty 28173 1726882750.00131: checking for any_errors_fatal 28173 1726882750.00137: done checking for any_errors_fatal 28173 1726882750.00138: checking for max_fail_percentage 28173 1726882750.00140: done checking for max_fail_percentage 28173 1726882750.00140: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.00141: done checking to see if all hosts have failed 28173 1726882750.00142: getting the remaining hosts for this loop 28173 1726882750.00143: done getting the remaining hosts for this loop 28173 1726882750.00146: getting the next task for host managed_node2 28173 1726882750.00152: done getting next task for host managed_node2 28173 1726882750.00154: ^ task is: TASK: Enable EPEL 7 28173 1726882750.00159: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882750.00162: getting variables 28173 1726882750.00167: in VariableManager get_vars() 28173 1726882750.00193: Calling all_inventory to load vars for managed_node2 28173 1726882750.00196: Calling groups_inventory to load vars for managed_node2 28173 1726882750.00199: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.00211: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.00215: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.00218: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.00406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.00623: done with get_vars() 28173 1726882750.00634: done getting variables 28173 1726882750.00707: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:39:10 -0400 (0:00:00.020) 0:00:03.171 ****** 28173 1726882750.00738: entering _queue_task() for managed_node2/command 28173 1726882750.00756: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000127 28173 1726882750.00767: WORKER PROCESS EXITING 28173 1726882750.01118: worker is 1 (out of 1 available) 28173 1726882750.01129: exiting _queue_task() for managed_node2/command 28173 1726882750.01139: done queuing things up, now waiting for results queue to drain 28173 1726882750.01140: waiting for pending results... 
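"Enable EPEL 7" (enable_epel.yml:32) is another command task behind the same distribution checks, and "Enable EPEL 8" at enable_epel.yml:37 follows the identical pattern immediately after it, so both are skipped on this host. A sketch of what such a task commonly looks like; the actual command is not recorded in this log, so the yum-config-manager call is a hypothetical stand-in:

    - name: Enable EPEL 7
      command: yum-config-manager --enable epel   # hypothetical; the real command is not captured here
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']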
28173 1726882750.01360: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 28173 1726882750.01471: in run() - task 0e448fcc-3ce9-926c-8928-000000000128 28173 1726882750.01491: variable 'ansible_search_path' from source: unknown 28173 1726882750.01498: variable 'ansible_search_path' from source: unknown 28173 1726882750.01535: calling self._execute() 28173 1726882750.01610: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.01621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.01633: variable 'omit' from source: magic vars 28173 1726882750.01985: variable 'ansible_distribution' from source: facts 28173 1726882750.02002: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28173 1726882750.02140: variable 'ansible_distribution_major_version' from source: facts 28173 1726882750.02150: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28173 1726882750.02158: when evaluation is False, skipping this task 28173 1726882750.02169: _execute() done 28173 1726882750.02176: dumping result to json 28173 1726882750.02184: done dumping result, returning 28173 1726882750.02194: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-926c-8928-000000000128] 28173 1726882750.02205: sending task result for task 0e448fcc-3ce9-926c-8928-000000000128 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28173 1726882750.02334: no more pending results, returning what we have 28173 1726882750.02337: results queue empty 28173 1726882750.02338: checking for any_errors_fatal 28173 1726882750.02346: done checking for any_errors_fatal 28173 1726882750.02347: checking for max_fail_percentage 28173 1726882750.02348: done checking for max_fail_percentage 28173 1726882750.02349: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.02350: done checking to see if all hosts have failed 28173 1726882750.02351: getting the remaining hosts for this loop 28173 1726882750.02352: done getting the remaining hosts for this loop 28173 1726882750.02356: getting the next task for host managed_node2 28173 1726882750.02362: done getting next task for host managed_node2 28173 1726882750.02368: ^ task is: TASK: Enable EPEL 8 28173 1726882750.02373: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882750.02377: getting variables 28173 1726882750.02378: in VariableManager get_vars() 28173 1726882750.02406: Calling all_inventory to load vars for managed_node2 28173 1726882750.02408: Calling groups_inventory to load vars for managed_node2 28173 1726882750.02412: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.02424: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.02427: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.02430: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.02853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.03230: done with get_vars() 28173 1726882750.03241: done getting variables 28173 1726882750.03270: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000128 28173 1726882750.03273: WORKER PROCESS EXITING 28173 1726882750.03303: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:39:10 -0400 (0:00:00.025) 0:00:03.197 ****** 28173 1726882750.03327: entering _queue_task() for managed_node2/command 28173 1726882750.03513: worker is 1 (out of 1 available) 28173 1726882750.03522: exiting _queue_task() for managed_node2/command 28173 1726882750.03532: done queuing things up, now waiting for results queue to drain 28173 1726882750.03533: waiting for pending results... 
28173 1726882750.03744: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 28173 1726882750.03858: in run() - task 0e448fcc-3ce9-926c-8928-000000000129 28173 1726882750.03884: variable 'ansible_search_path' from source: unknown 28173 1726882750.03891: variable 'ansible_search_path' from source: unknown 28173 1726882750.03926: calling self._execute() 28173 1726882750.03998: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.04009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.04021: variable 'omit' from source: magic vars 28173 1726882750.04377: variable 'ansible_distribution' from source: facts 28173 1726882750.04393: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28173 1726882750.04523: variable 'ansible_distribution_major_version' from source: facts 28173 1726882750.04533: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28173 1726882750.04540: when evaluation is False, skipping this task 28173 1726882750.04546: _execute() done 28173 1726882750.04552: dumping result to json 28173 1726882750.04558: done dumping result, returning 28173 1726882750.04573: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-926c-8928-000000000129] 28173 1726882750.04584: sending task result for task 0e448fcc-3ce9-926c-8928-000000000129 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28173 1726882750.04710: no more pending results, returning what we have 28173 1726882750.04713: results queue empty 28173 1726882750.04714: checking for any_errors_fatal 28173 1726882750.04720: done checking for any_errors_fatal 28173 1726882750.04721: checking for max_fail_percentage 28173 1726882750.04722: done checking for max_fail_percentage 28173 1726882750.04723: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.04724: done checking to see if all hosts have failed 28173 1726882750.04725: getting the remaining hosts for this loop 28173 1726882750.04726: done getting the remaining hosts for this loop 28173 1726882750.04729: getting the next task for host managed_node2 28173 1726882750.04737: done getting next task for host managed_node2 28173 1726882750.04740: ^ task is: TASK: Enable EPEL 6 28173 1726882750.04744: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882750.04747: getting variables 28173 1726882750.04750: in VariableManager get_vars() 28173 1726882750.04781: Calling all_inventory to load vars for managed_node2 28173 1726882750.04784: Calling groups_inventory to load vars for managed_node2 28173 1726882750.04787: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.04798: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.04802: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.04805: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.04997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.05214: done with get_vars() 28173 1726882750.05222: done getting variables 28173 1726882750.05289: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:39:10 -0400 (0:00:00.019) 0:00:03.217 ****** 28173 1726882750.05318: entering _queue_task() for managed_node2/copy 28173 1726882750.05334: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000129 28173 1726882750.05344: WORKER PROCESS EXITING 28173 1726882750.05668: worker is 1 (out of 1 available) 28173 1726882750.05680: exiting _queue_task() for managed_node2/copy 28173 1726882750.05690: done queuing things up, now waiting for results queue to drain 28173 1726882750.05692: waiting for pending results... 
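"Enable EPEL 6" (enable_epel.yml:42) uses the copy action rather than command and is additionally gated on `ansible_distribution_major_version == '6'`, so it is skipped as well. Its rough shape is sketched below, with the destination and body as placeholders since neither appears in the log:

    - name: Enable EPEL 6
      copy:
        dest: /etc/yum.repos.d/epel.repo   # hypothetical destination, not visible in this log
        content: |
          # EPEL 6 repo definition (not captured in this log)
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version == '6'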
28173 1726882750.05915: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 28173 1726882750.06024: in run() - task 0e448fcc-3ce9-926c-8928-00000000012b 28173 1726882750.06046: variable 'ansible_search_path' from source: unknown 28173 1726882750.06052: variable 'ansible_search_path' from source: unknown 28173 1726882750.06088: calling self._execute() 28173 1726882750.06157: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.06172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.06185: variable 'omit' from source: magic vars 28173 1726882750.06818: variable 'ansible_distribution' from source: facts 28173 1726882750.06835: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28173 1726882750.07096: variable 'ansible_distribution_major_version' from source: facts 28173 1726882750.07154: Evaluated conditional (ansible_distribution_major_version == '6'): False 28173 1726882750.07175: when evaluation is False, skipping this task 28173 1726882750.07194: _execute() done 28173 1726882750.07201: dumping result to json 28173 1726882750.07208: done dumping result, returning 28173 1726882750.07217: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-926c-8928-00000000012b] 28173 1726882750.07234: sending task result for task 0e448fcc-3ce9-926c-8928-00000000012b skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28173 1726882750.07406: no more pending results, returning what we have 28173 1726882750.07409: results queue empty 28173 1726882750.07410: checking for any_errors_fatal 28173 1726882750.07414: done checking for any_errors_fatal 28173 1726882750.07415: checking for max_fail_percentage 28173 1726882750.07417: done checking for max_fail_percentage 28173 1726882750.07417: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.07418: done checking to see if all hosts have failed 28173 1726882750.07419: getting the remaining hosts for this loop 28173 1726882750.07421: done getting the remaining hosts for this loop 28173 1726882750.07424: getting the next task for host managed_node2 28173 1726882750.07434: done getting next task for host managed_node2 28173 1726882750.07437: ^ task is: TASK: Set network provider to 'nm' 28173 1726882750.07439: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882750.07444: getting variables 28173 1726882750.07446: in VariableManager get_vars() 28173 1726882750.07480: Calling all_inventory to load vars for managed_node2 28173 1726882750.07483: Calling groups_inventory to load vars for managed_node2 28173 1726882750.07487: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.07501: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.07504: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.07508: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.07733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.07971: done with get_vars() 28173 1726882750.07981: done getting variables 28173 1726882750.08054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:13 Friday 20 September 2024 21:39:10 -0400 (0:00:00.027) 0:00:03.245 ****** 28173 1726882750.08090: entering _queue_task() for managed_node2/set_fact 28173 1726882750.08177: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000012b 28173 1726882750.08185: WORKER PROCESS EXITING 28173 1726882750.08833: worker is 1 (out of 1 available) 28173 1726882750.08845: exiting _queue_task() for managed_node2/set_fact 28173 1726882750.08857: done queuing things up, now waiting for results queue to drain 28173 1726882750.08858: waiting for pending results... 
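The set_fact task queued here ("Set network provider to 'nm'", tests_route_table_nm.yml:13) produces the network_provider fact reported a little further down in this trace. A sketch of such a task, assuming the file states the value literally rather than deriving it, would be:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

The fact name and value match the ok result logged below; everything else about the surrounding play is an assumption.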
28173 1726882750.09757: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 28173 1726882750.09847: in run() - task 0e448fcc-3ce9-926c-8928-000000000007 28173 1726882750.09887: variable 'ansible_search_path' from source: unknown 28173 1726882750.10002: calling self._execute() 28173 1726882750.10189: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.10205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.10218: variable 'omit' from source: magic vars 28173 1726882750.10442: variable 'omit' from source: magic vars 28173 1726882750.10481: variable 'omit' from source: magic vars 28173 1726882750.10559: variable 'omit' from source: magic vars 28173 1726882750.10657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882750.10762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882750.10788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882750.10855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882750.10953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882750.10989: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882750.10996: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.11002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.11121: Set connection var ansible_pipelining to False 28173 1726882750.11272: Set connection var ansible_shell_type to sh 28173 1726882750.11290: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882750.11302: Set connection var ansible_timeout to 10 28173 1726882750.11310: Set connection var ansible_shell_executable to /bin/sh 28173 1726882750.11318: Set connection var ansible_connection to ssh 28173 1726882750.11342: variable 'ansible_shell_executable' from source: unknown 28173 1726882750.11351: variable 'ansible_connection' from source: unknown 28173 1726882750.11357: variable 'ansible_module_compression' from source: unknown 28173 1726882750.11366: variable 'ansible_shell_type' from source: unknown 28173 1726882750.11377: variable 'ansible_shell_executable' from source: unknown 28173 1726882750.11384: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.11393: variable 'ansible_pipelining' from source: unknown 28173 1726882750.11403: variable 'ansible_timeout' from source: unknown 28173 1726882750.11489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.11753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882750.11774: variable 'omit' from source: magic vars 28173 1726882750.11784: starting attempt loop 28173 1726882750.11791: running the handler 28173 1726882750.11821: handler run complete 28173 1726882750.11928: attempt loop complete, returning result 28173 1726882750.11936: _execute() done 28173 1726882750.11946: 
dumping result to json 28173 1726882750.11954: done dumping result, returning 28173 1726882750.11967: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-926c-8928-000000000007] 28173 1726882750.11978: sending task result for task 0e448fcc-3ce9-926c-8928-000000000007 28173 1726882750.12084: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000007 28173 1726882750.12092: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 28173 1726882750.12142: no more pending results, returning what we have 28173 1726882750.12144: results queue empty 28173 1726882750.12145: checking for any_errors_fatal 28173 1726882750.12150: done checking for any_errors_fatal 28173 1726882750.12151: checking for max_fail_percentage 28173 1726882750.12152: done checking for max_fail_percentage 28173 1726882750.12153: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.12153: done checking to see if all hosts have failed 28173 1726882750.12154: getting the remaining hosts for this loop 28173 1726882750.12158: done getting the remaining hosts for this loop 28173 1726882750.12161: getting the next task for host managed_node2 28173 1726882750.12170: done getting next task for host managed_node2 28173 1726882750.12172: ^ task is: TASK: meta (flush_handlers) 28173 1726882750.12174: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882750.12179: getting variables 28173 1726882750.12180: in VariableManager get_vars() 28173 1726882750.12209: Calling all_inventory to load vars for managed_node2 28173 1726882750.12211: Calling groups_inventory to load vars for managed_node2 28173 1726882750.12215: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.12225: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.12228: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.12231: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.12533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.12791: done with get_vars() 28173 1726882750.12800: done getting variables 28173 1726882750.12869: in VariableManager get_vars() 28173 1726882750.12878: Calling all_inventory to load vars for managed_node2 28173 1726882750.12880: Calling groups_inventory to load vars for managed_node2 28173 1726882750.12882: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.12887: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.12889: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.12892: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.13306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.13846: done with get_vars() 28173 1726882750.13859: done queuing things up, now waiting for results queue to drain 28173 1726882750.13861: results queue empty 28173 1726882750.13862: checking for any_errors_fatal 28173 1726882750.13868: done checking for any_errors_fatal 28173 1726882750.13869: checking for 
max_fail_percentage 28173 1726882750.13870: done checking for max_fail_percentage 28173 1726882750.13871: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.13872: done checking to see if all hosts have failed 28173 1726882750.13872: getting the remaining hosts for this loop 28173 1726882750.13874: done getting the remaining hosts for this loop 28173 1726882750.13876: getting the next task for host managed_node2 28173 1726882750.13880: done getting next task for host managed_node2 28173 1726882750.13881: ^ task is: TASK: meta (flush_handlers) 28173 1726882750.13882: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882750.13890: getting variables 28173 1726882750.13891: in VariableManager get_vars() 28173 1726882750.13898: Calling all_inventory to load vars for managed_node2 28173 1726882750.13900: Calling groups_inventory to load vars for managed_node2 28173 1726882750.13903: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.13907: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.14055: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.14060: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.14473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.15027: done with get_vars() 28173 1726882750.15035: done getting variables 28173 1726882750.15081: in VariableManager get_vars() 28173 1726882750.15089: Calling all_inventory to load vars for managed_node2 28173 1726882750.15092: Calling groups_inventory to load vars for managed_node2 28173 1726882750.15094: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.15097: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.15100: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.15102: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.16472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.17155: done with get_vars() 28173 1726882750.17172: done queuing things up, now waiting for results queue to drain 28173 1726882750.17174: results queue empty 28173 1726882750.17175: checking for any_errors_fatal 28173 1726882750.17177: done checking for any_errors_fatal 28173 1726882750.17177: checking for max_fail_percentage 28173 1726882750.17178: done checking for max_fail_percentage 28173 1726882750.17179: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.17180: done checking to see if all hosts have failed 28173 1726882750.17181: getting the remaining hosts for this loop 28173 1726882750.17182: done getting the remaining hosts for this loop 28173 1726882750.17184: getting the next task for host managed_node2 28173 1726882750.17187: done getting next task for host managed_node2 28173 1726882750.17188: ^ task is: None 28173 1726882750.17189: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28173 1726882750.17190: done queuing things up, now waiting for results queue to drain 28173 1726882750.17191: results queue empty 28173 1726882750.17192: checking for any_errors_fatal 28173 1726882750.17192: done checking for any_errors_fatal 28173 1726882750.17193: checking for max_fail_percentage 28173 1726882750.17194: done checking for max_fail_percentage 28173 1726882750.17194: checking to see if all hosts have failed and the running result is not ok 28173 1726882750.17195: done checking to see if all hosts have failed 28173 1726882750.17197: getting the next task for host managed_node2 28173 1726882750.17199: done getting next task for host managed_node2 28173 1726882750.17200: ^ task is: None 28173 1726882750.17201: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882750.17329: in VariableManager get_vars() 28173 1726882750.17403: done with get_vars() 28173 1726882750.17410: in VariableManager get_vars() 28173 1726882750.17426: done with get_vars() 28173 1726882750.17431: variable 'omit' from source: magic vars 28173 1726882750.17462: in VariableManager get_vars() 28173 1726882750.17485: done with get_vars() 28173 1726882750.17510: variable 'omit' from source: magic vars PLAY [Play for testing route table] ******************************************** 28173 1726882750.18019: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28173 1726882750.18099: getting the remaining hosts for this loop 28173 1726882750.18100: done getting the remaining hosts for this loop 28173 1726882750.18103: getting the next task for host managed_node2 28173 1726882750.18105: done getting next task for host managed_node2 28173 1726882750.18107: ^ task is: TASK: Gathering Facts 28173 1726882750.18108: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882750.18110: getting variables 28173 1726882750.18111: in VariableManager get_vars() 28173 1726882750.18128: Calling all_inventory to load vars for managed_node2 28173 1726882750.18130: Calling groups_inventory to load vars for managed_node2 28173 1726882750.18132: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882750.18137: Calling all_plugins_play to load vars for managed_node2 28173 1726882750.18150: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882750.18153: Calling groups_plugins_play to load vars for managed_node2 28173 1726882750.18310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882750.18546: done with get_vars() 28173 1726882750.18553: done getting variables 28173 1726882750.18595: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3 Friday 20 September 2024 21:39:10 -0400 (0:00:00.105) 0:00:03.350 ****** 28173 1726882750.18634: entering _queue_task() for managed_node2/gather_facts 28173 1726882750.18893: worker is 1 (out of 1 available) 28173 1726882750.18902: exiting _queue_task() for managed_node2/gather_facts 28173 1726882750.18913: done queuing things up, now waiting for results queue to drain 28173 1726882750.18914: waiting for pending results... 
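The Gathering Facts task queued here belongs to the play "Play for testing route table" (tests_route_table.yml:3); the remote steps that follow (echo ~, temp-dir creation, AnsiballZ_setup.py transfer, chmod, and execution under /usr/bin/python3.9) are how the setup module is pushed and run over the multiplexed SSH connection. A play opening that would trigger this implicit fact gathering, with the host pattern assumed from the tasks seen in this run, looks roughly like:

    - name: Play for testing route table
      hosts: managed_node2      # assumed host pattern, not taken from the playbook itself
      gather_facts: true
      tasks:
        # the route table test tasks would be listed here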
28173 1726882750.19605: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882750.19767: in run() - task 0e448fcc-3ce9-926c-8928-000000000151 28173 1726882750.19847: variable 'ansible_search_path' from source: unknown 28173 1726882750.19931: calling self._execute() 28173 1726882750.20124: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.20222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.20246: variable 'omit' from source: magic vars 28173 1726882750.20698: variable 'ansible_distribution_major_version' from source: facts 28173 1726882750.20718: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882750.20731: variable 'omit' from source: magic vars 28173 1726882750.20759: variable 'omit' from source: magic vars 28173 1726882750.20803: variable 'omit' from source: magic vars 28173 1726882750.20847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882750.20890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882750.20913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882750.20938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882750.20957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882750.20995: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882750.21004: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.21012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.21120: Set connection var ansible_pipelining to False 28173 1726882750.21128: Set connection var ansible_shell_type to sh 28173 1726882750.21142: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882750.21157: Set connection var ansible_timeout to 10 28173 1726882750.21175: Set connection var ansible_shell_executable to /bin/sh 28173 1726882750.21186: Set connection var ansible_connection to ssh 28173 1726882750.21211: variable 'ansible_shell_executable' from source: unknown 28173 1726882750.21219: variable 'ansible_connection' from source: unknown 28173 1726882750.21226: variable 'ansible_module_compression' from source: unknown 28173 1726882750.21233: variable 'ansible_shell_type' from source: unknown 28173 1726882750.21240: variable 'ansible_shell_executable' from source: unknown 28173 1726882750.21246: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882750.21253: variable 'ansible_pipelining' from source: unknown 28173 1726882750.21260: variable 'ansible_timeout' from source: unknown 28173 1726882750.21275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882750.21545: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882750.21560: variable 'omit' from source: magic vars 28173 1726882750.21574: starting attempt loop 28173 1726882750.21581: running the 
handler 28173 1726882750.21605: variable 'ansible_facts' from source: unknown 28173 1726882750.21626: _low_level_execute_command(): starting 28173 1726882750.21638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882750.22818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882750.22833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.22850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.22875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.22916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.22930: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882750.22945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.22968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882750.22982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882750.22992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882750.23002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.23017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.23031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.23044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.23054: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882750.23075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.23152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882750.23184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882750.23203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882750.23344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882750.25662: stdout chunk (state=3): >>>/root <<< 28173 1726882750.25902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882750.25905: stdout chunk (state=3): >>><<< 28173 1726882750.25909: stderr chunk (state=3): >>><<< 28173 1726882750.26026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882750.26030: _low_level_execute_command(): starting 28173 1726882750.26033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702 `" && echo ansible-tmp-1726882750.259303-28367-48016918349702="` echo /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702 `" ) && sleep 0' 28173 1726882750.26702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882750.26716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.26730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.26747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.26799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.26816: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882750.26828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.26844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882750.26854: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882750.26868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882750.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.26892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.26918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.26934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.26945: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882750.26956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.27056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882750.27081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882750.27100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882750.27235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882750.29886: stdout chunk (state=3): >>>ansible-tmp-1726882750.259303-28367-48016918349702=/root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702 <<< 28173 1726882750.30125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882750.30128: stdout chunk (state=3): >>><<< 28173 1726882750.30130: stderr chunk (state=3): >>><<< 28173 1726882750.30375: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882750.259303-28367-48016918349702=/root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882750.30378: variable 'ansible_module_compression' from source: unknown 28173 1726882750.30380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882750.30382: variable 'ansible_facts' from source: unknown 28173 1726882750.30472: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/AnsiballZ_setup.py 28173 1726882750.30654: Sending initial data 28173 1726882750.30657: Sent initial data (152 bytes) 28173 1726882750.31896: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882750.31909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.31923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.31948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.31995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.32008: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882750.32025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.32042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882750.32061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882750.32077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882750.32089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.32101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.32115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.32126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.32135: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882750.32147: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.32234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882750.32255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882750.32285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882750.32425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882750.34918: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882750.35015: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882750.35118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp9runbdk3 /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/AnsiballZ_setup.py <<< 28173 1726882750.35219: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882750.37291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882750.37474: stderr chunk (state=3): >>><<< 28173 1726882750.37478: stdout chunk (state=3): >>><<< 28173 1726882750.37575: done transferring module to remote 28173 1726882750.37581: _low_level_execute_command(): starting 28173 1726882750.37584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/ /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/AnsiballZ_setup.py && sleep 0' 28173 1726882750.38322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.38325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.38328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.38330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.38333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.38393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882750.38396: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882750.38505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882750.41249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882750.41252: stdout chunk (state=3): >>><<< 28173 1726882750.41255: stderr chunk (state=3): >>><<< 28173 1726882750.41358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882750.41361: _low_level_execute_command(): starting 28173 1726882750.41369: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/AnsiballZ_setup.py && sleep 0' 28173 1726882750.41862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882750.41870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882750.41872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882750.41902: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882750.41904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.41907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882750.41909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882750.41910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882750.41969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882750.41975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882750.42089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 28173 1726882751.10637: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "10", "epoch": "1726882750", "epoch_int": "1726882750", "date": "2024-09-20", "time": "21:39:10", "iso8601_micro": "2024-09-21<<< 28173 1726882751.10657: stdout chunk (state=3): >>>T01:39:10.774812Z", "iso8601": "2024-09-21T01:39:10Z", "iso8601_basic": "20240920T213910774812", "iso8601_basic_short": "20240920T213910", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansib<<< 28173 1726882751.10726: stdout chunk (state=3): >>>le_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 689, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238321664, "block_size": 4096, "block_total": 65519355, "block_available": 64511309, "block_used": 1008046, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_loadavg": {"1m": 0.43, "5m": 0.41, "15m": 0.24}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed<<< 28173 1726882751.10734: stdout chunk (state=3): >>>]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882751.13021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882751.13025: stdout chunk (state=3): >>><<< 28173 1726882751.13027: stderr chunk (state=3): >>><<< 28173 1726882751.13072: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "10", "epoch": "1726882750", "epoch_int": "1726882750", "date": "2024-09-20", "time": "21:39:10", "iso8601_micro": "2024-09-21T01:39:10.774812Z", "iso8601": "2024-09-21T01:39:10Z", "iso8601_basic": "20240920T213910774812", "iso8601_basic_short": "20240920T213910", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 689, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238321664, "block_size": 4096, "block_total": 65519355, "block_available": 64511309, "block_used": 1008046, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_loadavg": {"1m": 0.43, "5m": 0.41, "15m": 0.24}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", 
"network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882751.13503: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882751.13545: _low_level_execute_command(): starting 28173 1726882751.13562: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882750.259303-28367-48016918349702/ > /dev/null 2>&1 && sleep 0' 28173 1726882751.14177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.14180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.14213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.14216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.14218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.14275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882751.14290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.14391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.17056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
28173 1726882751.17059: stdout chunk (state=3): >>><<< 28173 1726882751.17062: stderr chunk (state=3): >>><<< 28173 1726882751.17082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.17087: handler run complete 28173 1726882751.17191: variable 'ansible_facts' from source: unknown 28173 1726882751.17254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.17468: variable 'ansible_facts' from source: unknown 28173 1726882751.17523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.17605: attempt loop complete, returning result 28173 1726882751.17608: _execute() done 28173 1726882751.17611: dumping result to json 28173 1726882751.17634: done dumping result, returning 28173 1726882751.17641: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-000000000151] 28173 1726882751.17650: sending task result for task 0e448fcc-3ce9-926c-8928-000000000151 ok: [managed_node2] 28173 1726882751.18150: no more pending results, returning what we have 28173 1726882751.18152: results queue empty 28173 1726882751.18153: checking for any_errors_fatal 28173 1726882751.18154: done checking for any_errors_fatal 28173 1726882751.18154: checking for max_fail_percentage 28173 1726882751.18155: done checking for max_fail_percentage 28173 1726882751.18156: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.18156: done checking to see if all hosts have failed 28173 1726882751.18157: getting the remaining hosts for this loop 28173 1726882751.18157: done getting the remaining hosts for this loop 28173 1726882751.18160: getting the next task for host managed_node2 28173 1726882751.18166: done getting next task for host managed_node2 28173 1726882751.18168: ^ task is: TASK: meta (flush_handlers) 28173 1726882751.18169: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.18171: getting variables 28173 1726882751.18172: in VariableManager get_vars() 28173 1726882751.18196: Calling all_inventory to load vars for managed_node2 28173 1726882751.18197: Calling groups_inventory to load vars for managed_node2 28173 1726882751.18199: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.18207: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.18209: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.18211: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.18312: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000151 28173 1726882751.18315: WORKER PROCESS EXITING 28173 1726882751.18324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.18447: done with get_vars() 28173 1726882751.18455: done getting variables 28173 1726882751.18505: in VariableManager get_vars() 28173 1726882751.18516: Calling all_inventory to load vars for managed_node2 28173 1726882751.18518: Calling groups_inventory to load vars for managed_node2 28173 1726882751.18520: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.18523: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.18525: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.18526: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.18612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.18745: done with get_vars() 28173 1726882751.18754: done queuing things up, now waiting for results queue to drain 28173 1726882751.18755: results queue empty 28173 1726882751.18755: checking for any_errors_fatal 28173 1726882751.18757: done checking for any_errors_fatal 28173 1726882751.18758: checking for max_fail_percentage 28173 1726882751.18758: done checking for max_fail_percentage 28173 1726882751.18759: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.18762: done checking to see if all hosts have failed 28173 1726882751.18762: getting the remaining hosts for this loop 28173 1726882751.18763: done getting the remaining hosts for this loop 28173 1726882751.18768: getting the next task for host managed_node2 28173 1726882751.18771: done getting next task for host managed_node2 28173 1726882751.18772: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 28173 1726882751.18773: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.18774: getting variables 28173 1726882751.18775: in VariableManager get_vars() 28173 1726882751.18783: Calling all_inventory to load vars for managed_node2 28173 1726882751.18784: Calling groups_inventory to load vars for managed_node2 28173 1726882751.18785: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.18788: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.18790: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.18791: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.18904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.19089: done with get_vars() 28173 1726882751.19095: done getting variables 28173 1726882751.19121: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882751.19263: variable 'type' from source: play vars 28173 1726882751.19270: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:11 Friday 20 September 2024 21:39:11 -0400 (0:00:01.006) 0:00:04.357 ****** 28173 1726882751.19299: entering _queue_task() for managed_node2/set_fact 28173 1726882751.19528: worker is 1 (out of 1 available) 28173 1726882751.19538: exiting _queue_task() for managed_node2/set_fact 28173 1726882751.19549: done queuing things up, now waiting for results queue to drain 28173 1726882751.19550: waiting for pending results... 
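
The task just queued is the set_fact whose raw name, "Set type={{ type }} and interface={{ interface }}" (tests_route_table.yml:11), is rendered from the play vars resolved in the records immediately above; the trace that follows resolves those two vars again and returns them as host facts with "changed": false. The playbook source itself is not reproduced in this log, so the snippet below is only a sketch of a play that would produce the same trace: the play-level vars block and overall layout are assumptions, while the fact names and the values veth/ethtest0 come from the recorded result.

    # Illustrative sketch only. The actual tests_route_table.yml is not
    # reproduced in this log; the play-level vars are assumed, and the
    # recorded result confirms only the resulting fact names and values.
    - hosts: all
      vars:
        type: veth            # assumed to be defined as a play var
        interface: ethtest0   # assumed to be defined as a play var
      tasks:
        - name: "Set type={{ type }} and interface={{ interface }}"
          set_fact:
            type: "{{ type }}"
            interface: "{{ interface }}"

Because set_fact is handled by an action plugin on the controller, no module payload is shipped to the managed host for this task; accordingly, the trace below contains no _low_level_execute_command() calls, only variable resolution, "running the handler", and the returned ansible_facts.
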
28173 1726882751.19803: running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 28173 1726882751.19900: in run() - task 0e448fcc-3ce9-926c-8928-00000000000b 28173 1726882751.19922: variable 'ansible_search_path' from source: unknown 28173 1726882751.19969: calling self._execute() 28173 1726882751.20053: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.20062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.20078: variable 'omit' from source: magic vars 28173 1726882751.20393: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.20403: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.20409: variable 'omit' from source: magic vars 28173 1726882751.20443: variable 'omit' from source: magic vars 28173 1726882751.20467: variable 'type' from source: play vars 28173 1726882751.20516: variable 'type' from source: play vars 28173 1726882751.20522: variable 'interface' from source: play vars 28173 1726882751.20622: variable 'interface' from source: play vars 28173 1726882751.20652: variable 'omit' from source: magic vars 28173 1726882751.20706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882751.20768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882751.20796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882751.20817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.20833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.20879: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882751.20888: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.20896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.21007: Set connection var ansible_pipelining to False 28173 1726882751.21014: Set connection var ansible_shell_type to sh 28173 1726882751.21026: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882751.21037: Set connection var ansible_timeout to 10 28173 1726882751.21046: Set connection var ansible_shell_executable to /bin/sh 28173 1726882751.21054: Set connection var ansible_connection to ssh 28173 1726882751.21095: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.21102: variable 'ansible_connection' from source: unknown 28173 1726882751.21108: variable 'ansible_module_compression' from source: unknown 28173 1726882751.21114: variable 'ansible_shell_type' from source: unknown 28173 1726882751.21120: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.21126: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.21133: variable 'ansible_pipelining' from source: unknown 28173 1726882751.21139: variable 'ansible_timeout' from source: unknown 28173 1726882751.21145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.21302: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882751.21328: variable 'omit' from source: magic vars 28173 1726882751.21337: starting attempt loop 28173 1726882751.21344: running the handler 28173 1726882751.21360: handler run complete 28173 1726882751.21379: attempt loop complete, returning result 28173 1726882751.21395: _execute() done 28173 1726882751.21408: dumping result to json 28173 1726882751.21433: done dumping result, returning 28173 1726882751.21451: done running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 [0e448fcc-3ce9-926c-8928-00000000000b] 28173 1726882751.21476: sending task result for task 0e448fcc-3ce9-926c-8928-00000000000b 28173 1726882751.21627: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000000b ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 28173 1726882751.21707: no more pending results, returning what we have 28173 1726882751.21709: results queue empty 28173 1726882751.21710: checking for any_errors_fatal 28173 1726882751.21712: done checking for any_errors_fatal 28173 1726882751.21713: checking for max_fail_percentage 28173 1726882751.21714: done checking for max_fail_percentage 28173 1726882751.21715: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.21715: done checking to see if all hosts have failed 28173 1726882751.21716: getting the remaining hosts for this loop 28173 1726882751.21718: done getting the remaining hosts for this loop 28173 1726882751.21721: getting the next task for host managed_node2 28173 1726882751.21727: done getting next task for host managed_node2 28173 1726882751.21729: ^ task is: TASK: Include the task 'show_interfaces.yml' 28173 1726882751.21731: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.21735: getting variables 28173 1726882751.21736: in VariableManager get_vars() 28173 1726882751.21785: Calling all_inventory to load vars for managed_node2 28173 1726882751.21788: Calling groups_inventory to load vars for managed_node2 28173 1726882751.21790: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.21801: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.21804: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.21807: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.22083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.22324: done with get_vars() 28173 1726882751.22334: done getting variables 28173 1726882751.22427: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:15 Friday 20 September 2024 21:39:11 -0400 (0:00:00.031) 0:00:04.389 ****** 28173 1726882751.22497: entering _queue_task() for managed_node2/include_tasks 28173 1726882751.22886: worker is 1 (out of 1 available) 28173 1726882751.22896: exiting _queue_task() for managed_node2/include_tasks 28173 1726882751.22906: done queuing things up, now waiting for results queue to drain 28173 1726882751.22907: waiting for pending results... 28173 1726882751.23562: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28173 1726882751.23694: in run() - task 0e448fcc-3ce9-926c-8928-00000000000c 28173 1726882751.23713: variable 'ansible_search_path' from source: unknown 28173 1726882751.23772: calling self._execute() 28173 1726882751.23870: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.23889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.23912: variable 'omit' from source: magic vars 28173 1726882751.24416: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.24559: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.24574: _execute() done 28173 1726882751.24581: dumping result to json 28173 1726882751.24588: done dumping result, returning 28173 1726882751.24597: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-926c-8928-00000000000c] 28173 1726882751.24606: sending task result for task 0e448fcc-3ce9-926c-8928-00000000000c 28173 1726882751.24727: no more pending results, returning what we have 28173 1726882751.24733: in VariableManager get_vars() 28173 1726882751.24779: Calling all_inventory to load vars for managed_node2 28173 1726882751.24783: Calling groups_inventory to load vars for managed_node2 28173 1726882751.24785: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.24797: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.24801: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.24804: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.25065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.25343: done with get_vars() 28173 1726882751.25351: variable 'ansible_search_path' from source: unknown 28173 1726882751.25367: we have included 
files to process 28173 1726882751.25368: generating all_blocks data 28173 1726882751.25370: done generating all_blocks data 28173 1726882751.25371: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.25372: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.25374: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.25817: in VariableManager get_vars() 28173 1726882751.25837: done with get_vars() 28173 1726882751.25879: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000000c 28173 1726882751.25883: WORKER PROCESS EXITING 28173 1726882751.25975: done processing included file 28173 1726882751.25977: iterating over new_blocks loaded from include file 28173 1726882751.25979: in VariableManager get_vars() 28173 1726882751.25995: done with get_vars() 28173 1726882751.25997: filtering new block on tags 28173 1726882751.26014: done filtering new block on tags 28173 1726882751.26016: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28173 1726882751.26020: extending task lists for all hosts with included blocks 28173 1726882751.27432: done extending task lists 28173 1726882751.27433: done processing included files 28173 1726882751.27433: results queue empty 28173 1726882751.27434: checking for any_errors_fatal 28173 1726882751.27436: done checking for any_errors_fatal 28173 1726882751.27436: checking for max_fail_percentage 28173 1726882751.27437: done checking for max_fail_percentage 28173 1726882751.27437: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.27438: done checking to see if all hosts have failed 28173 1726882751.27438: getting the remaining hosts for this loop 28173 1726882751.27440: done getting the remaining hosts for this loop 28173 1726882751.27441: getting the next task for host managed_node2 28173 1726882751.27443: done getting next task for host managed_node2 28173 1726882751.27445: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28173 1726882751.27447: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.27448: getting variables 28173 1726882751.27449: in VariableManager get_vars() 28173 1726882751.27457: Calling all_inventory to load vars for managed_node2 28173 1726882751.27459: Calling groups_inventory to load vars for managed_node2 28173 1726882751.27460: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.27467: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.27469: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.27471: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.27575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.27716: done with get_vars() 28173 1726882751.27723: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.052) 0:00:04.442 ****** 28173 1726882751.27789: entering _queue_task() for managed_node2/include_tasks 28173 1726882751.28000: worker is 1 (out of 1 available) 28173 1726882751.28012: exiting _queue_task() for managed_node2/include_tasks 28173 1726882751.28023: done queuing things up, now waiting for results queue to drain 28173 1726882751.28024: waiting for pending results... 28173 1726882751.28182: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28173 1726882751.28246: in run() - task 0e448fcc-3ce9-926c-8928-000000000169 28173 1726882751.28256: variable 'ansible_search_path' from source: unknown 28173 1726882751.28263: variable 'ansible_search_path' from source: unknown 28173 1726882751.28310: calling self._execute() 28173 1726882751.28377: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.28381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.28405: variable 'omit' from source: magic vars 28173 1726882751.28705: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.28721: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.28731: _execute() done 28173 1726882751.28738: dumping result to json 28173 1726882751.28744: done dumping result, returning 28173 1726882751.28753: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-926c-8928-000000000169] 28173 1726882751.28767: sending task result for task 0e448fcc-3ce9-926c-8928-000000000169 28173 1726882751.28862: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000169 28173 1726882751.28898: no more pending results, returning what we have 28173 1726882751.28903: in VariableManager get_vars() 28173 1726882751.28943: Calling all_inventory to load vars for managed_node2 28173 1726882751.28946: Calling groups_inventory to load vars for managed_node2 28173 1726882751.28948: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.28959: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.28962: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.28966: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.29156: WORKER PROCESS EXITING 28173 1726882751.29181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 28173 1726882751.29405: done with get_vars() 28173 1726882751.29421: variable 'ansible_search_path' from source: unknown 28173 1726882751.29422: variable 'ansible_search_path' from source: unknown 28173 1726882751.29456: we have included files to process 28173 1726882751.29457: generating all_blocks data 28173 1726882751.29458: done generating all_blocks data 28173 1726882751.29459: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.29460: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.29462: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.29817: done processing included file 28173 1726882751.29818: iterating over new_blocks loaded from include file 28173 1726882751.29819: in VariableManager get_vars() 28173 1726882751.29830: done with get_vars() 28173 1726882751.29831: filtering new block on tags 28173 1726882751.29841: done filtering new block on tags 28173 1726882751.29842: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28173 1726882751.29845: extending task lists for all hosts with included blocks 28173 1726882751.29911: done extending task lists 28173 1726882751.29912: done processing included files 28173 1726882751.29912: results queue empty 28173 1726882751.29913: checking for any_errors_fatal 28173 1726882751.29914: done checking for any_errors_fatal 28173 1726882751.29915: checking for max_fail_percentage 28173 1726882751.29916: done checking for max_fail_percentage 28173 1726882751.29916: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.29917: done checking to see if all hosts have failed 28173 1726882751.29917: getting the remaining hosts for this loop 28173 1726882751.29918: done getting the remaining hosts for this loop 28173 1726882751.29919: getting the next task for host managed_node2 28173 1726882751.29922: done getting next task for host managed_node2 28173 1726882751.29923: ^ task is: TASK: Gather current interface info 28173 1726882751.29925: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.29926: getting variables 28173 1726882751.29927: in VariableManager get_vars() 28173 1726882751.29935: Calling all_inventory to load vars for managed_node2 28173 1726882751.29936: Calling groups_inventory to load vars for managed_node2 28173 1726882751.29937: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.29942: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.29943: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.29945: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.30035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.30151: done with get_vars() 28173 1726882751.30157: done getting variables 28173 1726882751.30189: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.024) 0:00:04.466 ****** 28173 1726882751.30208: entering _queue_task() for managed_node2/command 28173 1726882751.30361: worker is 1 (out of 1 available) 28173 1726882751.30375: exiting _queue_task() for managed_node2/command 28173 1726882751.30386: done queuing things up, now waiting for results queue to drain 28173 1726882751.30387: waiting for pending results... 
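
The next task, "Gather current interface info" (get_current_interfaces.yml:3), uses the command action, so unlike the controller-side set_fact above it triggers the full remote execution pipeline documented in the records that follow: a probe command ('echo ~ && sleep 0') to resolve the remote home directory, creation of a unique per-task directory under ~/.ansible/tmp, an sftp upload of the built AnsiballZ_command.py payload, chmod u+x on the temporary directory and payload, then execution and cleanup. The contents of get_current_interfaces.yml are only referenced by path in this log, so the snippet below is a hypothetical reconstruction: the exact command and the register name are assumptions, while the task name, file path, and use of the command module come from the trace.

    # Hypothetical sketch of the task at get_current_interfaces.yml:3; the
    # real file is not shown in this log. Listing /sys/class/net and the
    # register name are assumed implementation details.
    - name: Gather current interface info
      command: ls /sys/class/net
      register: _current_interfaces

If the assumed command is close to the real one, its output would list the same devices already visible in the gathered facts above ("ansible_interfaces": ["rpltstbr", "lo", "eth0"]).
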
28173 1726882751.30522: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28173 1726882751.30592: in run() - task 0e448fcc-3ce9-926c-8928-00000000024e 28173 1726882751.30602: variable 'ansible_search_path' from source: unknown 28173 1726882751.30607: variable 'ansible_search_path' from source: unknown 28173 1726882751.30639: calling self._execute() 28173 1726882751.30698: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.30702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.30708: variable 'omit' from source: magic vars 28173 1726882751.30957: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.30969: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.30980: variable 'omit' from source: magic vars 28173 1726882751.31008: variable 'omit' from source: magic vars 28173 1726882751.31031: variable 'omit' from source: magic vars 28173 1726882751.31068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882751.31099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882751.31114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882751.31127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.31137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.31161: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882751.31171: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.31176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.31240: Set connection var ansible_pipelining to False 28173 1726882751.31244: Set connection var ansible_shell_type to sh 28173 1726882751.31250: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882751.31257: Set connection var ansible_timeout to 10 28173 1726882751.31261: Set connection var ansible_shell_executable to /bin/sh 28173 1726882751.31274: Set connection var ansible_connection to ssh 28173 1726882751.31293: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.31296: variable 'ansible_connection' from source: unknown 28173 1726882751.31299: variable 'ansible_module_compression' from source: unknown 28173 1726882751.31301: variable 'ansible_shell_type' from source: unknown 28173 1726882751.31303: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.31305: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.31307: variable 'ansible_pipelining' from source: unknown 28173 1726882751.31310: variable 'ansible_timeout' from source: unknown 28173 1726882751.31313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.31413: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882751.31421: variable 'omit' from source: magic vars 28173 
1726882751.31426: starting attempt loop 28173 1726882751.31429: running the handler 28173 1726882751.31440: _low_level_execute_command(): starting 28173 1726882751.31446: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882751.32089: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882751.32100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.32169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.32173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.32176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.32178: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882751.32181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.32183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882751.32185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882751.32278: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882751.32281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.32284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.32286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.32288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.32290: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882751.32292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.32372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.32375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882751.32378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.32451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.34009: stdout chunk (state=3): >>>/root <<< 28173 1726882751.34291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.34294: stdout chunk (state=3): >>><<< 28173 1726882751.34296: stderr chunk (state=3): >>><<< 28173 1726882751.34300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.34302: _low_level_execute_command(): starting 28173 1726882751.34305: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292 `" && echo ansible-tmp-1726882751.342041-28433-200597349138292="` echo /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292 `" ) && sleep 0' 28173 1726882751.34896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882751.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.34928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.34954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.34996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.35008: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882751.35023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.35044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882751.35068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882751.35083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882751.35097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.35112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.35129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.35143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.35166: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882751.35183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.35256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.35286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882751.35303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.35432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.37301: stdout chunk (state=3): >>>ansible-tmp-1726882751.342041-28433-200597349138292=/root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292 <<< 28173 1726882751.37414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.37503: stderr chunk (state=3): >>><<< 28173 1726882751.37515: stdout chunk (state=3): >>><<< 28173 1726882751.37775: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882751.342041-28433-200597349138292=/root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.37779: variable 'ansible_module_compression' from source: unknown 28173 1726882751.37781: ANSIBALLZ: Using generic lock for ansible.legacy.command 28173 1726882751.37783: ANSIBALLZ: Acquiring lock 28173 1726882751.37785: ANSIBALLZ: Lock acquired: 140243978110592 28173 1726882751.37787: ANSIBALLZ: Creating module 28173 1726882751.49686: ANSIBALLZ: Writing module into payload 28173 1726882751.49801: ANSIBALLZ: Writing module 28173 1726882751.49831: ANSIBALLZ: Renaming module 28173 1726882751.49842: ANSIBALLZ: Done creating module 28173 1726882751.49863: variable 'ansible_facts' from source: unknown 28173 1726882751.49943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/AnsiballZ_command.py 28173 1726882751.50102: Sending initial data 28173 1726882751.50105: Sent initial data (155 bytes) 28173 1726882751.51049: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882751.51061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.51080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.51096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.51135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.51144: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882751.51155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.51177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882751.51187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882751.51196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882751.51206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.51220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.51235: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.51247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.51257: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882751.51276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.51347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.51372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882751.51387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.51517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.53369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882751.53461: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882751.53561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpjfkwu171 /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/AnsiballZ_command.py <<< 28173 1726882751.53657: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882751.54925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.55072: stderr chunk (state=3): >>><<< 28173 1726882751.55076: stdout chunk (state=3): >>><<< 28173 1726882751.55172: done transferring module to remote 28173 1726882751.55175: _low_level_execute_command(): starting 28173 1726882751.55178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/ /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/AnsiballZ_command.py && sleep 0' 28173 1726882751.55735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882751.55749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.55768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.55787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.55826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.55838: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882751.55852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.55871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882751.55887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 
1726882751.55899: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882751.55913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.55926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.55942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.55955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.55969: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882751.55983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.56053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.56079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882751.56095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.56250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.57999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.58039: stderr chunk (state=3): >>><<< 28173 1726882751.58042: stdout chunk (state=3): >>><<< 28173 1726882751.58055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.58058: _low_level_execute_command(): starting 28173 1726882751.58067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/AnsiballZ_command.py && sleep 0' 28173 1726882751.58497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.58503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.58532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882751.58536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.58538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.58592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.58595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.58705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.72141: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:11.716192", "end": "2024-09-20 21:39:11.719446", "delta": "0:00:00.003254", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882751.73303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882751.73353: stderr chunk (state=3): >>><<< 28173 1726882751.73356: stdout chunk (state=3): >>><<< 28173 1726882751.73373: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:11.716192", "end": "2024-09-20 21:39:11.719446", "delta": "0:00:00.003254", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
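The JSON blob returned above is the raw result of the ansible.legacy.command module run for the "Gather current interface info" task; the echoed module_args (chdir=/sys/class/net, _raw_params=ls -1) imply a task along the lines of the sketch below. This is an illustration rather than the literal contents of get_current_interfaces.yml: the register name _current_interfaces is inferred from the set_fact variable trace further down, and changed_when: false is inferred from the "changed": false shown in the rendered task result.

# Sketch only; see the note above for which names are inferred.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # inferred register name
  changed_when: false             # inferred from the displayed "changed": false
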
28173 1726882751.73403: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882751.73409: _low_level_execute_command(): starting 28173 1726882751.73416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882751.342041-28433-200597349138292/ > /dev/null 2>&1 && sleep 0' 28173 1726882751.73851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.73854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.73890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882751.73893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28173 1726882751.73896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.73898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.73948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.73951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.74054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.75861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.75929: stderr chunk (state=3): >>><<< 28173 1726882751.75932: stdout chunk (state=3): >>><<< 28173 1726882751.75945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.75954: handler run complete 28173 1726882751.75986: Evaluated conditional (False): False 28173 1726882751.75999: attempt loop complete, returning result 28173 1726882751.76009: _execute() done 28173 1726882751.76014: dumping result to json 28173 1726882751.76021: done dumping result, returning 28173 1726882751.76031: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-926c-8928-00000000024e] 28173 1726882751.76038: sending task result for task 0e448fcc-3ce9-926c-8928-00000000024e ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003254", "end": "2024-09-20 21:39:11.719446", "rc": 0, "start": "2024-09-20 21:39:11.716192" } STDOUT: bonding_masters eth0 lo rpltstbr 28173 1726882751.76218: no more pending results, returning what we have 28173 1726882751.76222: results queue empty 28173 1726882751.76222: checking for any_errors_fatal 28173 1726882751.76224: done checking for any_errors_fatal 28173 1726882751.76224: checking for max_fail_percentage 28173 1726882751.76225: done checking for max_fail_percentage 28173 1726882751.76226: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.76227: done checking to see if all hosts have failed 28173 1726882751.76227: getting the remaining hosts for this loop 28173 1726882751.76229: done getting the remaining hosts for this loop 28173 1726882751.76232: getting the next task for host managed_node2 28173 1726882751.76238: done getting next task for host managed_node2 28173 1726882751.76240: ^ task is: TASK: Set current_interfaces 28173 1726882751.76244: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.76247: getting variables 28173 1726882751.76249: in VariableManager get_vars() 28173 1726882751.76293: Calling all_inventory to load vars for managed_node2 28173 1726882751.76296: Calling groups_inventory to load vars for managed_node2 28173 1726882751.76298: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.76309: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.76312: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.76315: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.76509: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000024e 28173 1726882751.76513: WORKER PROCESS EXITING 28173 1726882751.76527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.76727: done with get_vars() 28173 1726882751.76737: done getting variables 28173 1726882751.76793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:11 -0400 (0:00:00.466) 0:00:04.932 ****** 28173 1726882751.76825: entering _queue_task() for managed_node2/set_fact 28173 1726882751.77047: worker is 1 (out of 1 available) 28173 1726882751.77057: exiting _queue_task() for managed_node2/set_fact 28173 1726882751.77071: done queuing things up, now waiting for results queue to drain 28173 1726882751.77072: waiting for pending results... 
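The TaskExecutor trace that follows resolves the connection settings for managed_node2 before the set_fact handler runs (the "Set connection var" lines below: ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_timeout=10, ansible_pipelining=False, ansible_module_compression=ZIP_DEFLATED). Most of these are ansible-core defaults, so the real inventory may not set any of them explicitly; a purely hypothetical host_vars sketch that would resolve to the same values:

# Hypothetical host_vars for managed_node2; unset values fall back to these defaults anyway.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
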
28173 1726882751.77303: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28173 1726882751.77405: in run() - task 0e448fcc-3ce9-926c-8928-00000000024f 28173 1726882751.77425: variable 'ansible_search_path' from source: unknown 28173 1726882751.77431: variable 'ansible_search_path' from source: unknown 28173 1726882751.77472: calling self._execute() 28173 1726882751.77555: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.77567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.77580: variable 'omit' from source: magic vars 28173 1726882751.77912: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.77927: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.77936: variable 'omit' from source: magic vars 28173 1726882751.77983: variable 'omit' from source: magic vars 28173 1726882751.78086: variable '_current_interfaces' from source: set_fact 28173 1726882751.78145: variable 'omit' from source: magic vars 28173 1726882751.78190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882751.78224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882751.78245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882751.78263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.78286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.78311: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882751.78314: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.78318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.78386: Set connection var ansible_pipelining to False 28173 1726882751.78389: Set connection var ansible_shell_type to sh 28173 1726882751.78395: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882751.78403: Set connection var ansible_timeout to 10 28173 1726882751.78408: Set connection var ansible_shell_executable to /bin/sh 28173 1726882751.78412: Set connection var ansible_connection to ssh 28173 1726882751.78429: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.78432: variable 'ansible_connection' from source: unknown 28173 1726882751.78434: variable 'ansible_module_compression' from source: unknown 28173 1726882751.78438: variable 'ansible_shell_type' from source: unknown 28173 1726882751.78440: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.78442: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.78444: variable 'ansible_pipelining' from source: unknown 28173 1726882751.78446: variable 'ansible_timeout' from source: unknown 28173 1726882751.78448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.78545: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 28173 1726882751.78553: variable 'omit' from source: magic vars 28173 1726882751.78559: starting attempt loop 28173 1726882751.78562: running the handler 28173 1726882751.78575: handler run complete 28173 1726882751.78583: attempt loop complete, returning result 28173 1726882751.78585: _execute() done 28173 1726882751.78588: dumping result to json 28173 1726882751.78590: done dumping result, returning 28173 1726882751.78597: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-926c-8928-00000000024f] 28173 1726882751.78602: sending task result for task 0e448fcc-3ce9-926c-8928-00000000024f 28173 1726882751.78678: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000024f 28173 1726882751.78681: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 28173 1726882751.78769: no more pending results, returning what we have 28173 1726882751.78772: results queue empty 28173 1726882751.78772: checking for any_errors_fatal 28173 1726882751.78779: done checking for any_errors_fatal 28173 1726882751.78780: checking for max_fail_percentage 28173 1726882751.78781: done checking for max_fail_percentage 28173 1726882751.78782: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.78783: done checking to see if all hosts have failed 28173 1726882751.78783: getting the remaining hosts for this loop 28173 1726882751.78784: done getting the remaining hosts for this loop 28173 1726882751.78788: getting the next task for host managed_node2 28173 1726882751.78796: done getting next task for host managed_node2 28173 1726882751.78798: ^ task is: TASK: Show current_interfaces 28173 1726882751.78799: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.78802: getting variables 28173 1726882751.78802: in VariableManager get_vars() 28173 1726882751.78827: Calling all_inventory to load vars for managed_node2 28173 1726882751.78828: Calling groups_inventory to load vars for managed_node2 28173 1726882751.78830: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.78836: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.78838: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.78839: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.78949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.79076: done with get_vars() 28173 1726882751.79083: done getting variables 28173 1726882751.79147: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:11 -0400 (0:00:00.023) 0:00:04.956 ****** 28173 1726882751.79168: entering _queue_task() for managed_node2/debug 28173 1726882751.79169: Creating lock for debug 28173 1726882751.79331: worker is 1 (out of 1 available) 28173 1726882751.79342: exiting _queue_task() for managed_node2/debug 28173 1726882751.79353: done queuing things up, now waiting for results queue to drain 28173 1726882751.79354: waiting for pending results... 
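The "Set current_interfaces" result a little further up (ok: ... with ansible_facts.current_interfaces) turns the registered command output into a fact. A minimal sketch of that set_fact task, assuming it simply copies the stdout_lines of the registered result (the actual expression in get_current_interfaces.yml is not reproduced in this log):

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed expression
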
28173 1726882751.79493: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28173 1726882751.79544: in run() - task 0e448fcc-3ce9-926c-8928-00000000016a 28173 1726882751.79556: variable 'ansible_search_path' from source: unknown 28173 1726882751.79567: variable 'ansible_search_path' from source: unknown 28173 1726882751.79595: calling self._execute() 28173 1726882751.79655: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.79659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.79674: variable 'omit' from source: magic vars 28173 1726882751.80175: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.80184: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.80190: variable 'omit' from source: magic vars 28173 1726882751.80217: variable 'omit' from source: magic vars 28173 1726882751.80280: variable 'current_interfaces' from source: set_fact 28173 1726882751.80299: variable 'omit' from source: magic vars 28173 1726882751.80329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882751.80353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882751.80368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882751.80384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.80393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.80412: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882751.80415: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.80418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.80489: Set connection var ansible_pipelining to False 28173 1726882751.80492: Set connection var ansible_shell_type to sh 28173 1726882751.80498: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882751.80505: Set connection var ansible_timeout to 10 28173 1726882751.80510: Set connection var ansible_shell_executable to /bin/sh 28173 1726882751.80514: Set connection var ansible_connection to ssh 28173 1726882751.80529: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.80537: variable 'ansible_connection' from source: unknown 28173 1726882751.80539: variable 'ansible_module_compression' from source: unknown 28173 1726882751.80542: variable 'ansible_shell_type' from source: unknown 28173 1726882751.80544: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.80546: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.80550: variable 'ansible_pipelining' from source: unknown 28173 1726882751.80552: variable 'ansible_timeout' from source: unknown 28173 1726882751.80556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.80645: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
28173 1726882751.80656: variable 'omit' from source: magic vars 28173 1726882751.80661: starting attempt loop 28173 1726882751.80664: running the handler 28173 1726882751.80706: handler run complete 28173 1726882751.80716: attempt loop complete, returning result 28173 1726882751.80719: _execute() done 28173 1726882751.80721: dumping result to json 28173 1726882751.80723: done dumping result, returning 28173 1726882751.80729: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-926c-8928-00000000016a] 28173 1726882751.80734: sending task result for task 0e448fcc-3ce9-926c-8928-00000000016a 28173 1726882751.80816: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000016a 28173 1726882751.80819: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 28173 1726882751.80870: no more pending results, returning what we have 28173 1726882751.80873: results queue empty 28173 1726882751.80874: checking for any_errors_fatal 28173 1726882751.80876: done checking for any_errors_fatal 28173 1726882751.80877: checking for max_fail_percentage 28173 1726882751.80878: done checking for max_fail_percentage 28173 1726882751.80879: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.80880: done checking to see if all hosts have failed 28173 1726882751.80880: getting the remaining hosts for this loop 28173 1726882751.80881: done getting the remaining hosts for this loop 28173 1726882751.80885: getting the next task for host managed_node2 28173 1726882751.80890: done getting next task for host managed_node2 28173 1726882751.80893: ^ task is: TASK: Include the task 'manage_test_interface.yml' 28173 1726882751.80894: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882751.80897: getting variables 28173 1726882751.80898: in VariableManager get_vars() 28173 1726882751.80926: Calling all_inventory to load vars for managed_node2 28173 1726882751.80928: Calling groups_inventory to load vars for managed_node2 28173 1726882751.80930: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.80936: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.80938: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.80939: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.81226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.81344: done with get_vars() 28173 1726882751.81350: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:17 Friday 20 September 2024 21:39:11 -0400 (0:00:00.022) 0:00:04.978 ****** 28173 1726882751.81409: entering _queue_task() for managed_node2/include_tasks 28173 1726882751.81554: worker is 1 (out of 1 available) 28173 1726882751.81566: exiting _queue_task() for managed_node2/include_tasks 28173 1726882751.81577: done queuing things up, now waiting for results queue to drain 28173 1726882751.81578: waiting for pending results... 
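The MSG rendered above comes from the debug action loaded for show_interfaces.yml:5. A sketch of what that "Show current_interfaces" task plausibly looks like (the exact msg expression is an assumption):

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"   # assumed wording of the msg
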
28173 1726882751.81724: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 28173 1726882751.81781: in run() - task 0e448fcc-3ce9-926c-8928-00000000000d 28173 1726882751.81791: variable 'ansible_search_path' from source: unknown 28173 1726882751.81819: calling self._execute() 28173 1726882751.81885: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.81889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.81896: variable 'omit' from source: magic vars 28173 1726882751.82148: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.82158: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.82167: _execute() done 28173 1726882751.82170: dumping result to json 28173 1726882751.82175: done dumping result, returning 28173 1726882751.82178: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-926c-8928-00000000000d] 28173 1726882751.82186: sending task result for task 0e448fcc-3ce9-926c-8928-00000000000d 28173 1726882751.82272: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000000d 28173 1726882751.82277: WORKER PROCESS EXITING 28173 1726882751.82307: no more pending results, returning what we have 28173 1726882751.82311: in VariableManager get_vars() 28173 1726882751.82347: Calling all_inventory to load vars for managed_node2 28173 1726882751.82349: Calling groups_inventory to load vars for managed_node2 28173 1726882751.82351: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.82359: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.82361: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.82363: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.82478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.82605: done with get_vars() 28173 1726882751.82612: variable 'ansible_search_path' from source: unknown 28173 1726882751.82622: we have included files to process 28173 1726882751.82623: generating all_blocks data 28173 1726882751.82624: done generating all_blocks data 28173 1726882751.82627: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28173 1726882751.82627: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28173 1726882751.82629: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 28173 1726882751.82954: in VariableManager get_vars() 28173 1726882751.82970: done with get_vars() 28173 1726882751.83109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28173 1726882751.83466: done processing included file 28173 1726882751.83468: iterating over new_blocks loaded from include file 28173 1726882751.83468: in VariableManager get_vars() 28173 1726882751.83482: done with get_vars() 28173 1726882751.83483: filtering new block on tags 28173 1726882751.83502: done filtering new block on tags 28173 1726882751.83503: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 28173 1726882751.83514: extending task lists for all hosts with included blocks 28173 1726882751.84665: done extending task lists 28173 1726882751.84666: done processing included files 28173 1726882751.84667: results queue empty 28173 1726882751.84668: checking for any_errors_fatal 28173 1726882751.84670: done checking for any_errors_fatal 28173 1726882751.84671: checking for max_fail_percentage 28173 1726882751.84672: done checking for max_fail_percentage 28173 1726882751.84672: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.84673: done checking to see if all hosts have failed 28173 1726882751.84673: getting the remaining hosts for this loop 28173 1726882751.84675: done getting the remaining hosts for this loop 28173 1726882751.84677: getting the next task for host managed_node2 28173 1726882751.84680: done getting next task for host managed_node2 28173 1726882751.84681: ^ task is: TASK: Ensure state in ["present", "absent"] 28173 1726882751.84683: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882751.84685: getting variables 28173 1726882751.84685: in VariableManager get_vars() 28173 1726882751.84694: Calling all_inventory to load vars for managed_node2 28173 1726882751.84695: Calling groups_inventory to load vars for managed_node2 28173 1726882751.84696: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.84699: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.84701: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.84703: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.84789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.84904: done with get_vars() 28173 1726882751.84910: done getting variables 28173 1726882751.84950: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.035) 0:00:05.014 ****** 28173 1726882751.84975: entering _queue_task() for managed_node2/fail 28173 1726882751.84979: Creating lock for fail 28173 1726882751.85141: worker is 1 (out of 1 available) 28173 1726882751.85155: exiting _queue_task() for managed_node2/fail 28173 1726882751.85168: done queuing things up, now waiting for results queue to drain 28173 1726882751.85169: waiting for pending results... 
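The include processed above (tests_route_table.yml:17 pulling manage_test_interface.yml into the task list for managed_node2) corresponds to an include_tasks task along these lines; the relative path is an assumption based on the collection layout shown in the loading messages:

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml   # path form assumed
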
28173 1726882751.85314: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 28173 1726882751.85423: in run() - task 0e448fcc-3ce9-926c-8928-00000000026a 28173 1726882751.85435: variable 'ansible_search_path' from source: unknown 28173 1726882751.85438: variable 'ansible_search_path' from source: unknown 28173 1726882751.85471: calling self._execute() 28173 1726882751.85537: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.85540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.85549: variable 'omit' from source: magic vars 28173 1726882751.85804: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.85813: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.85931: variable 'state' from source: include params 28173 1726882751.85935: Evaluated conditional (state not in ["present", "absent"]): False 28173 1726882751.85939: when evaluation is False, skipping this task 28173 1726882751.85942: _execute() done 28173 1726882751.85944: dumping result to json 28173 1726882751.85946: done dumping result, returning 28173 1726882751.85949: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-926c-8928-00000000026a] 28173 1726882751.85956: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026a 28173 1726882751.86037: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026a 28173 1726882751.86040: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 28173 1726882751.86095: no more pending results, returning what we have 28173 1726882751.86099: results queue empty 28173 1726882751.86099: checking for any_errors_fatal 28173 1726882751.86101: done checking for any_errors_fatal 28173 1726882751.86101: checking for max_fail_percentage 28173 1726882751.86103: done checking for max_fail_percentage 28173 1726882751.86103: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.86104: done checking to see if all hosts have failed 28173 1726882751.86105: getting the remaining hosts for this loop 28173 1726882751.86106: done getting the remaining hosts for this loop 28173 1726882751.86109: getting the next task for host managed_node2 28173 1726882751.86113: done getting next task for host managed_node2 28173 1726882751.86115: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 28173 1726882751.86118: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.86120: getting variables 28173 1726882751.86121: in VariableManager get_vars() 28173 1726882751.86144: Calling all_inventory to load vars for managed_node2 28173 1726882751.86146: Calling groups_inventory to load vars for managed_node2 28173 1726882751.86148: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.86159: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.86161: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.86162: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.86288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.86410: done with get_vars() 28173 1726882751.86417: done getting variables 28173 1726882751.86451: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:11 -0400 (0:00:00.014) 0:00:05.029 ****** 28173 1726882751.86471: entering _queue_task() for managed_node2/fail 28173 1726882751.86959: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 28173 1726882751.86977: worker is 1 (out of 1 available) 28173 1726882751.86984: exiting _queue_task() for managed_node2/fail 28173 1726882751.86992: done queuing things up, now waiting for results queue to drain 28173 1726882751.86993: waiting for pending results... 
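The skipped task above and the "Ensure type ..." task queued next are the input guards at the top of manage_test_interface.yml; their when clauses can be read directly from the false_condition fields in the trace. A sketch of the pair (the fail messages are placeholders, not the file's actual text):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"   # placeholder message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"   # placeholder message
  when: type not in ["dummy", "tap", "veth"]
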
28173 1726882751.87000: in run() - task 0e448fcc-3ce9-926c-8928-00000000026b 28173 1726882751.87008: variable 'ansible_search_path' from source: unknown 28173 1726882751.87015: variable 'ansible_search_path' from source: unknown 28173 1726882751.87057: calling self._execute() 28173 1726882751.87142: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.87155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.87179: variable 'omit' from source: magic vars 28173 1726882751.87512: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.87528: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.87748: variable 'type' from source: set_fact 28173 1726882751.87778: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 28173 1726882751.87791: when evaluation is False, skipping this task 28173 1726882751.87797: _execute() done 28173 1726882751.87802: dumping result to json 28173 1726882751.87808: done dumping result, returning 28173 1726882751.87823: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-926c-8928-00000000026b] 28173 1726882751.87832: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026b 28173 1726882751.87950: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026b 28173 1726882751.87953: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 28173 1726882751.88032: no more pending results, returning what we have 28173 1726882751.88035: results queue empty 28173 1726882751.88036: checking for any_errors_fatal 28173 1726882751.88041: done checking for any_errors_fatal 28173 1726882751.88041: checking for max_fail_percentage 28173 1726882751.88043: done checking for max_fail_percentage 28173 1726882751.88044: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.88044: done checking to see if all hosts have failed 28173 1726882751.88045: getting the remaining hosts for this loop 28173 1726882751.88046: done getting the remaining hosts for this loop 28173 1726882751.88049: getting the next task for host managed_node2 28173 1726882751.88053: done getting next task for host managed_node2 28173 1726882751.88055: ^ task is: TASK: Include the task 'show_interfaces.yml' 28173 1726882751.88058: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.88061: getting variables 28173 1726882751.88063: in VariableManager get_vars() 28173 1726882751.88091: Calling all_inventory to load vars for managed_node2 28173 1726882751.88093: Calling groups_inventory to load vars for managed_node2 28173 1726882751.88094: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.88100: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.88102: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.88104: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.88215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.88338: done with get_vars() 28173 1726882751.88344: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:11 -0400 (0:00:00.019) 0:00:05.048 ****** 28173 1726882751.88405: entering _queue_task() for managed_node2/include_tasks 28173 1726882751.88553: worker is 1 (out of 1 available) 28173 1726882751.88566: exiting _queue_task() for managed_node2/include_tasks 28173 1726882751.88577: done queuing things up, now waiting for results queue to drain 28173 1726882751.88578: waiting for pending results... 28173 1726882751.88711: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28173 1726882751.88769: in run() - task 0e448fcc-3ce9-926c-8928-00000000026c 28173 1726882751.88783: variable 'ansible_search_path' from source: unknown 28173 1726882751.88787: variable 'ansible_search_path' from source: unknown 28173 1726882751.88815: calling self._execute() 28173 1726882751.88873: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.88877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.88885: variable 'omit' from source: magic vars 28173 1726882751.89156: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.89166: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.89174: _execute() done 28173 1726882751.89177: dumping result to json 28173 1726882751.89180: done dumping result, returning 28173 1726882751.89184: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-926c-8928-00000000026c] 28173 1726882751.89190: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026c 28173 1726882751.89267: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026c 28173 1726882751.89270: WORKER PROCESS EXITING 28173 1726882751.89298: no more pending results, returning what we have 28173 1726882751.89303: in VariableManager get_vars() 28173 1726882751.89338: Calling all_inventory to load vars for managed_node2 28173 1726882751.89340: Calling groups_inventory to load vars for managed_node2 28173 1726882751.89343: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.89350: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.89352: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.89353: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.89496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28173 1726882751.89620: done with get_vars() 28173 1726882751.89625: variable 'ansible_search_path' from source: unknown 28173 1726882751.89626: variable 'ansible_search_path' from source: unknown 28173 1726882751.89648: we have included files to process 28173 1726882751.89649: generating all_blocks data 28173 1726882751.89650: done generating all_blocks data 28173 1726882751.89652: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.89653: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.89654: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28173 1726882751.89720: in VariableManager get_vars() 28173 1726882751.89734: done with get_vars() 28173 1726882751.89829: done processing included file 28173 1726882751.89831: iterating over new_blocks loaded from include file 28173 1726882751.89832: in VariableManager get_vars() 28173 1726882751.89849: done with get_vars() 28173 1726882751.89851: filtering new block on tags 28173 1726882751.89867: done filtering new block on tags 28173 1726882751.89870: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28173 1726882751.89874: extending task lists for all hosts with included blocks 28173 1726882751.90244: done extending task lists 28173 1726882751.90246: done processing included files 28173 1726882751.90246: results queue empty 28173 1726882751.90247: checking for any_errors_fatal 28173 1726882751.90249: done checking for any_errors_fatal 28173 1726882751.90249: checking for max_fail_percentage 28173 1726882751.90250: done checking for max_fail_percentage 28173 1726882751.90251: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.90251: done checking to see if all hosts have failed 28173 1726882751.90252: getting the remaining hosts for this loop 28173 1726882751.90253: done getting the remaining hosts for this loop 28173 1726882751.90255: getting the next task for host managed_node2 28173 1726882751.90258: done getting next task for host managed_node2 28173 1726882751.90260: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28173 1726882751.90262: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.90266: getting variables 28173 1726882751.90267: in VariableManager get_vars() 28173 1726882751.90279: Calling all_inventory to load vars for managed_node2 28173 1726882751.90281: Calling groups_inventory to load vars for managed_node2 28173 1726882751.90284: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.90289: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.90291: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.90294: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.90457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.90665: done with get_vars() 28173 1726882751.90675: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.023) 0:00:05.071 ****** 28173 1726882751.90740: entering _queue_task() for managed_node2/include_tasks 28173 1726882751.90930: worker is 1 (out of 1 available) 28173 1726882751.90942: exiting _queue_task() for managed_node2/include_tasks 28173 1726882751.90953: done queuing things up, now waiting for results queue to drain 28173 1726882751.90954: waiting for pending results... 28173 1726882751.91193: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28173 1726882751.91295: in run() - task 0e448fcc-3ce9-926c-8928-000000000369 28173 1726882751.91311: variable 'ansible_search_path' from source: unknown 28173 1726882751.91318: variable 'ansible_search_path' from source: unknown 28173 1726882751.91357: calling self._execute() 28173 1726882751.91445: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.91454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.91469: variable 'omit' from source: magic vars 28173 1726882751.91821: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.91842: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.91851: _execute() done 28173 1726882751.91857: dumping result to json 28173 1726882751.91866: done dumping result, returning 28173 1726882751.91878: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-926c-8928-000000000369] 28173 1726882751.91890: sending task result for task 0e448fcc-3ce9-926c-8928-000000000369 28173 1726882751.92005: no more pending results, returning what we have 28173 1726882751.92011: in VariableManager get_vars() 28173 1726882751.92058: Calling all_inventory to load vars for managed_node2 28173 1726882751.92061: Calling groups_inventory to load vars for managed_node2 28173 1726882751.92065: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.92080: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.92083: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.92087: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.92291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.92512: done with get_vars() 28173 1726882751.92520: variable 'ansible_search_path' from source: 
unknown 28173 1726882751.92521: variable 'ansible_search_path' from source: unknown 28173 1726882751.92582: we have included files to process 28173 1726882751.92584: generating all_blocks data 28173 1726882751.92586: done generating all_blocks data 28173 1726882751.92588: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.92589: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.92591: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28173 1726882751.92870: done processing included file 28173 1726882751.92872: iterating over new_blocks loaded from include file 28173 1726882751.92873: in VariableManager get_vars() 28173 1726882751.92897: done with get_vars() 28173 1726882751.92899: filtering new block on tags 28173 1726882751.92916: done filtering new block on tags 28173 1726882751.92918: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28173 1726882751.92924: extending task lists for all hosts with included blocks 28173 1726882751.93297: done extending task lists 28173 1726882751.93298: done processing included files 28173 1726882751.93299: results queue empty 28173 1726882751.93300: checking for any_errors_fatal 28173 1726882751.93303: done checking for any_errors_fatal 28173 1726882751.93304: checking for max_fail_percentage 28173 1726882751.93305: done checking for max_fail_percentage 28173 1726882751.93306: checking to see if all hosts have failed and the running result is not ok 28173 1726882751.93306: done checking to see if all hosts have failed 28173 1726882751.93307: getting the remaining hosts for this loop 28173 1726882751.93308: done getting the remaining hosts for this loop 28173 1726882751.93311: getting the next task for host managed_node2 28173 1726882751.93315: done getting next task for host managed_node2 28173 1726882751.93317: ^ task is: TASK: Gather current interface info 28173 1726882751.93320: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882751.93322: getting variables 28173 1726882751.93323: in VariableManager get_vars() 28173 1726882751.93335: Calling all_inventory to load vars for managed_node2 28173 1726882751.93337: Calling groups_inventory to load vars for managed_node2 28173 1726882751.93339: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882751.93345: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000369 28173 1726882751.93348: WORKER PROCESS EXITING 28173 1726882751.93352: Calling all_plugins_play to load vars for managed_node2 28173 1726882751.93354: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882751.93357: Calling groups_plugins_play to load vars for managed_node2 28173 1726882751.93537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882751.93747: done with get_vars() 28173 1726882751.93756: done getting variables 28173 1726882751.93796: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.030) 0:00:05.102 ****** 28173 1726882751.93824: entering _queue_task() for managed_node2/command 28173 1726882751.94053: worker is 1 (out of 1 available) 28173 1726882751.94064: exiting _queue_task() for managed_node2/command 28173 1726882751.94076: done queuing things up, now waiting for results queue to drain 28173 1726882751.94077: waiting for pending results... 
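
For orientation, the include chain the executor is walking at this point is manage_test_interface.yml:13 -> show_interfaces.yml -> get_current_interfaces.yml, and the command action plugin has just been loaded for the upcoming "Gather current interface info" task. Below is a hedged sketch of what show_interfaces.yml plausibly contains, inferred only from the task names, actions (include_tasks, debug) and line numbers in the log; the actual file in the collection may differ.

# --- hedged reconstruction, not taken verbatim from the log ---
# show_interfaces.yml as implied by the log above: an include_tasks around
# line 3 and a debug task around line 5 whose output matches the
# "current_interfaces: [...]" message seen later in this run.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
# --- end of reconstruction ---
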
28173 1726882751.94390: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28173 1726882751.94504: in run() - task 0e448fcc-3ce9-926c-8928-0000000003a0 28173 1726882751.94527: variable 'ansible_search_path' from source: unknown 28173 1726882751.94534: variable 'ansible_search_path' from source: unknown 28173 1726882751.94574: calling self._execute() 28173 1726882751.94664: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.94678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.94691: variable 'omit' from source: magic vars 28173 1726882751.95043: variable 'ansible_distribution_major_version' from source: facts 28173 1726882751.95064: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882751.95077: variable 'omit' from source: magic vars 28173 1726882751.95131: variable 'omit' from source: magic vars 28173 1726882751.95242: variable 'omit' from source: magic vars 28173 1726882751.95290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882751.95327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882751.95350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882751.95375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.95397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882751.95430: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882751.95440: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.95448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.95551: Set connection var ansible_pipelining to False 28173 1726882751.95559: Set connection var ansible_shell_type to sh 28173 1726882751.95576: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882751.95589: Set connection var ansible_timeout to 10 28173 1726882751.95601: Set connection var ansible_shell_executable to /bin/sh 28173 1726882751.95613: Set connection var ansible_connection to ssh 28173 1726882751.95637: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.95646: variable 'ansible_connection' from source: unknown 28173 1726882751.95654: variable 'ansible_module_compression' from source: unknown 28173 1726882751.95660: variable 'ansible_shell_type' from source: unknown 28173 1726882751.95669: variable 'ansible_shell_executable' from source: unknown 28173 1726882751.95678: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882751.95685: variable 'ansible_pipelining' from source: unknown 28173 1726882751.95691: variable 'ansible_timeout' from source: unknown 28173 1726882751.95699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882751.95830: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882751.95843: variable 'omit' from source: magic vars 28173 
1726882751.95850: starting attempt loop 28173 1726882751.95855: running the handler 28173 1726882751.95873: _low_level_execute_command(): starting 28173 1726882751.95885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882751.96996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.97000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.97042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.97046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.97048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.97111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882751.97124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882751.97248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882751.98905: stdout chunk (state=3): >>>/root <<< 28173 1726882751.99090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882751.99093: stdout chunk (state=3): >>><<< 28173 1726882751.99095: stderr chunk (state=3): >>><<< 28173 1726882751.99201: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882751.99207: _low_level_execute_command(): starting 28173 1726882751.99210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792 `" 
&& echo ansible-tmp-1726882751.9911914-28465-271544460651792="` echo /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792 `" ) && sleep 0' 28173 1726882751.99792: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882751.99806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882751.99821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882751.99839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882751.99892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882751.99911: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882751.99926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882751.99944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882751.99957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882751.99972: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882751.99993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.00008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.00024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.00037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.00049: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.00063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.00149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.00173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.00192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.00332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.02219: stdout chunk (state=3): >>>ansible-tmp-1726882751.9911914-28465-271544460651792=/root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792 <<< 28173 1726882752.02378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.02412: stderr chunk (state=3): >>><<< 28173 1726882752.02415: stdout chunk (state=3): >>><<< 28173 1726882752.02471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882751.9911914-28465-271544460651792=/root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882752.02475: variable 'ansible_module_compression' from source: unknown 28173 1726882752.02722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882752.02725: variable 'ansible_facts' from source: unknown 28173 1726882752.02727: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/AnsiballZ_command.py 28173 1726882752.02792: Sending initial data 28173 1726882752.02795: Sent initial data (156 bytes) 28173 1726882752.03884: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882752.03897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.03910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.03926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.03967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.03985: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.03999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.04018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882752.04029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882752.04039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882752.04050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.04062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.04082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.04098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.04109: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.04121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.04198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.04224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.04238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.04370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.06130: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882752.06224: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882752.06326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp7gfxthm2 /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/AnsiballZ_command.py <<< 28173 1726882752.06423: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882752.08269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.08396: stderr chunk (state=3): >>><<< 28173 1726882752.08399: stdout chunk (state=3): >>><<< 28173 1726882752.08401: done transferring module to remote 28173 1726882752.08403: _low_level_execute_command(): starting 28173 1726882752.08406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/ /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/AnsiballZ_command.py && sleep 0' 28173 1726882752.10121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882752.10135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.10149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.10172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.10218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.10230: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.10243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.10259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882752.10273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882752.10285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882752.10298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.10313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.10328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.10339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.10348: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.10360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.10441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.10466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.10484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 28173 1726882752.10613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.12519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.12522: stdout chunk (state=3): >>><<< 28173 1726882752.12524: stderr chunk (state=3): >>><<< 28173 1726882752.12611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882752.12614: _low_level_execute_command(): starting 28173 1726882752.12617: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/AnsiballZ_command.py && sleep 0' 28173 1726882752.13777: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882752.13791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.13807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.13831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.13880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.13894: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.13907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.13967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882752.13982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882752.13997: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882752.14009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.14022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.14041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.14053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.14065: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.14079: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.14159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.14184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.14200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.14336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.27855: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:12.273284", "end": "2024-09-20 21:39:12.276606", "delta": "0:00:00.003322", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882752.29146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882752.29150: stdout chunk (state=3): >>><<< 28173 1726882752.29153: stderr chunk (state=3): >>><<< 28173 1726882752.29295: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:12.273284", "end": "2024-09-20 21:39:12.276606", "delta": "0:00:00.003322", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
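
The JSON result above echoes the exact module invocation: ansible.legacy.command run with chdir=/sys/class/net and `ls -1`, returning the four entries bonding_masters, eth0, lo and rpltstbr. A hedged sketch of the task that would produce this invocation follows; the register name is a guess based on the `_current_interfaces` variable consumed by the next set_fact task.

# --- hedged reconstruction, not taken verbatim from the log ---
# 'Gather current interface info' (get_current_interfaces.yml:3) as implied by
# the module_args echoed in the result above.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
# --- end of reconstruction ---
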
28173 1726882752.29303: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882752.29306: _low_level_execute_command(): starting 28173 1726882752.29309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882751.9911914-28465-271544460651792/ > /dev/null 2>&1 && sleep 0' 28173 1726882752.30565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.30570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.30650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.30654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.30656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.30730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.30733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.30846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.32671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.32741: stderr chunk (state=3): >>><<< 28173 1726882752.32752: stdout chunk (state=3): >>><<< 28173 1726882752.32873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882752.32878: handler run complete 28173 1726882752.32881: Evaluated conditional (False): False 28173 1726882752.32883: attempt loop complete, returning result 28173 1726882752.32885: _execute() done 28173 1726882752.32887: dumping result to json 28173 1726882752.32889: done dumping result, returning 28173 1726882752.32891: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-926c-8928-0000000003a0] 28173 1726882752.32893: sending task result for task 0e448fcc-3ce9-926c-8928-0000000003a0 28173 1726882752.33389: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000003a0 28173 1726882752.33392: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003322", "end": "2024-09-20 21:39:12.276606", "rc": 0, "start": "2024-09-20 21:39:12.273284" } STDOUT: bonding_masters eth0 lo rpltstbr 28173 1726882752.33458: no more pending results, returning what we have 28173 1726882752.33461: results queue empty 28173 1726882752.33462: checking for any_errors_fatal 28173 1726882752.33465: done checking for any_errors_fatal 28173 1726882752.33466: checking for max_fail_percentage 28173 1726882752.33468: done checking for max_fail_percentage 28173 1726882752.33469: checking to see if all hosts have failed and the running result is not ok 28173 1726882752.33470: done checking to see if all hosts have failed 28173 1726882752.33471: getting the remaining hosts for this loop 28173 1726882752.33472: done getting the remaining hosts for this loop 28173 1726882752.33477: getting the next task for host managed_node2 28173 1726882752.33484: done getting next task for host managed_node2 28173 1726882752.33486: ^ task is: TASK: Set current_interfaces 28173 1726882752.33490: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882752.33493: getting variables 28173 1726882752.33495: in VariableManager get_vars() 28173 1726882752.33532: Calling all_inventory to load vars for managed_node2 28173 1726882752.33534: Calling groups_inventory to load vars for managed_node2 28173 1726882752.33536: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882752.33546: Calling all_plugins_play to load vars for managed_node2 28173 1726882752.33549: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882752.33551: Calling groups_plugins_play to load vars for managed_node2 28173 1726882752.33728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882752.33968: done with get_vars() 28173 1726882752.33979: done getting variables 28173 1726882752.34041: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:12 -0400 (0:00:00.402) 0:00:05.505 ****** 28173 1726882752.34078: entering _queue_task() for managed_node2/set_fact 28173 1726882752.34314: worker is 1 (out of 1 available) 28173 1726882752.34331: exiting _queue_task() for managed_node2/set_fact 28173 1726882752.34343: done queuing things up, now waiting for results queue to drain 28173 1726882752.34344: waiting for pending results... 
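
The set_fact task queued here turns the registered command output into the `current_interfaces` fact; its result a few lines below lists exactly the stdout lines of the `ls -1` run. A plausible sketch, with the exact expression an assumption consistent with that result:

# --- hedged reconstruction, not taken verbatim from the log ---
# 'Set current_interfaces' (get_current_interfaces.yml:9); the stdout_lines
# expression is assumed, chosen to match the fact value shown in the task
# result below.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
# --- end of reconstruction ---
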
28173 1726882752.34593: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28173 1726882752.34705: in run() - task 0e448fcc-3ce9-926c-8928-0000000003a1 28173 1726882752.34724: variable 'ansible_search_path' from source: unknown 28173 1726882752.34731: variable 'ansible_search_path' from source: unknown 28173 1726882752.34777: calling self._execute() 28173 1726882752.34957: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.34973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.34987: variable 'omit' from source: magic vars 28173 1726882752.35344: variable 'ansible_distribution_major_version' from source: facts 28173 1726882752.35360: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882752.35374: variable 'omit' from source: magic vars 28173 1726882752.35433: variable 'omit' from source: magic vars 28173 1726882752.35550: variable '_current_interfaces' from source: set_fact 28173 1726882752.35614: variable 'omit' from source: magic vars 28173 1726882752.35662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882752.35700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882752.35722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882752.35747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882752.35765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882752.35798: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882752.35805: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.35812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.35914: Set connection var ansible_pipelining to False 28173 1726882752.35922: Set connection var ansible_shell_type to sh 28173 1726882752.35933: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882752.35948: Set connection var ansible_timeout to 10 28173 1726882752.35958: Set connection var ansible_shell_executable to /bin/sh 28173 1726882752.35968: Set connection var ansible_connection to ssh 28173 1726882752.35996: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.36003: variable 'ansible_connection' from source: unknown 28173 1726882752.36009: variable 'ansible_module_compression' from source: unknown 28173 1726882752.36014: variable 'ansible_shell_type' from source: unknown 28173 1726882752.36020: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.36025: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.36032: variable 'ansible_pipelining' from source: unknown 28173 1726882752.36037: variable 'ansible_timeout' from source: unknown 28173 1726882752.36044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.36187: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 28173 1726882752.36205: variable 'omit' from source: magic vars 28173 1726882752.36214: starting attempt loop 28173 1726882752.36219: running the handler 28173 1726882752.36233: handler run complete 28173 1726882752.36246: attempt loop complete, returning result 28173 1726882752.36251: _execute() done 28173 1726882752.36257: dumping result to json 28173 1726882752.36265: done dumping result, returning 28173 1726882752.36280: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-926c-8928-0000000003a1] 28173 1726882752.36290: sending task result for task 0e448fcc-3ce9-926c-8928-0000000003a1 28173 1726882752.36389: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000003a1 28173 1726882752.36396: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 28173 1726882752.36459: no more pending results, returning what we have 28173 1726882752.36462: results queue empty 28173 1726882752.36463: checking for any_errors_fatal 28173 1726882752.36471: done checking for any_errors_fatal 28173 1726882752.36472: checking for max_fail_percentage 28173 1726882752.36473: done checking for max_fail_percentage 28173 1726882752.36474: checking to see if all hosts have failed and the running result is not ok 28173 1726882752.36475: done checking to see if all hosts have failed 28173 1726882752.36476: getting the remaining hosts for this loop 28173 1726882752.36477: done getting the remaining hosts for this loop 28173 1726882752.36481: getting the next task for host managed_node2 28173 1726882752.36491: done getting next task for host managed_node2 28173 1726882752.36493: ^ task is: TASK: Show current_interfaces 28173 1726882752.36497: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882752.36501: getting variables 28173 1726882752.36503: in VariableManager get_vars() 28173 1726882752.36541: Calling all_inventory to load vars for managed_node2 28173 1726882752.36544: Calling groups_inventory to load vars for managed_node2 28173 1726882752.36546: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882752.36557: Calling all_plugins_play to load vars for managed_node2 28173 1726882752.36560: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882752.36565: Calling groups_plugins_play to load vars for managed_node2 28173 1726882752.36792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882752.37009: done with get_vars() 28173 1726882752.37018: done getting variables 28173 1726882752.37069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:12 -0400 (0:00:00.031) 0:00:05.536 ****** 28173 1726882752.37219: entering _queue_task() for managed_node2/debug 28173 1726882752.37592: worker is 1 (out of 1 available) 28173 1726882752.37614: exiting _queue_task() for managed_node2/debug 28173 1726882752.37640: done queuing things up, now waiting for results queue to drain 28173 1726882752.37642: waiting for pending results... 28173 1726882752.37935: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28173 1726882752.38052: in run() - task 0e448fcc-3ce9-926c-8928-00000000036a 28173 1726882752.38083: variable 'ansible_search_path' from source: unknown 28173 1726882752.38092: variable 'ansible_search_path' from source: unknown 28173 1726882752.38132: calling self._execute() 28173 1726882752.38240: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.38251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.38265: variable 'omit' from source: magic vars 28173 1726882752.38656: variable 'ansible_distribution_major_version' from source: facts 28173 1726882752.38678: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882752.38690: variable 'omit' from source: magic vars 28173 1726882752.38745: variable 'omit' from source: magic vars 28173 1726882752.38850: variable 'current_interfaces' from source: set_fact 28173 1726882752.38883: variable 'omit' from source: magic vars 28173 1726882752.38926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882752.38974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882752.38999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882752.39021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882752.39038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882752.39084: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882752.39094: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.39102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.39215: Set connection var ansible_pipelining to False 28173 1726882752.39223: Set connection var ansible_shell_type to sh 28173 1726882752.39235: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882752.39247: Set connection var ansible_timeout to 10 28173 1726882752.39257: Set connection var ansible_shell_executable to /bin/sh 28173 1726882752.39273: Set connection var ansible_connection to ssh 28173 1726882752.39301: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.39309: variable 'ansible_connection' from source: unknown 28173 1726882752.39316: variable 'ansible_module_compression' from source: unknown 28173 1726882752.39322: variable 'ansible_shell_type' from source: unknown 28173 1726882752.39329: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.39335: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.39343: variable 'ansible_pipelining' from source: unknown 28173 1726882752.39349: variable 'ansible_timeout' from source: unknown 28173 1726882752.39357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.39513: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882752.39529: variable 'omit' from source: magic vars 28173 1726882752.39539: starting attempt loop 28173 1726882752.39546: running the handler 28173 1726882752.39605: handler run complete 28173 1726882752.39624: attempt loop complete, returning result 28173 1726882752.39631: _execute() done 28173 1726882752.39637: dumping result to json 28173 1726882752.39644: done dumping result, returning 28173 1726882752.39655: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-926c-8928-00000000036a] 28173 1726882752.39666: sending task result for task 0e448fcc-3ce9-926c-8928-00000000036a ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 28173 1726882752.39813: no more pending results, returning what we have 28173 1726882752.39816: results queue empty 28173 1726882752.39817: checking for any_errors_fatal 28173 1726882752.39822: done checking for any_errors_fatal 28173 1726882752.39823: checking for max_fail_percentage 28173 1726882752.39825: done checking for max_fail_percentage 28173 1726882752.39826: checking to see if all hosts have failed and the running result is not ok 28173 1726882752.39827: done checking to see if all hosts have failed 28173 1726882752.39828: getting the remaining hosts for this loop 28173 1726882752.39829: done getting the remaining hosts for this loop 28173 1726882752.39833: getting the next task for host managed_node2 28173 1726882752.39842: done getting next task for host managed_node2 28173 1726882752.39845: ^ task is: TASK: Install iproute 28173 1726882752.39848: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882752.39852: getting variables 28173 1726882752.39854: in VariableManager get_vars() 28173 1726882752.39895: Calling all_inventory to load vars for managed_node2 28173 1726882752.39898: Calling groups_inventory to load vars for managed_node2 28173 1726882752.39900: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882752.39911: Calling all_plugins_play to load vars for managed_node2 28173 1726882752.39913: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882752.39916: Calling groups_plugins_play to load vars for managed_node2 28173 1726882752.40100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882752.40350: done with get_vars() 28173 1726882752.40360: done getting variables 28173 1726882752.40413: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:12 -0400 (0:00:00.032) 0:00:05.568 ****** 28173 1726882752.40447: entering _queue_task() for managed_node2/package 28173 1726882752.40463: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000036a 28173 1726882752.40473: WORKER PROCESS EXITING 28173 1726882752.40900: worker is 1 (out of 1 available) 28173 1726882752.40912: exiting _queue_task() for managed_node2/package 28173 1726882752.40923: done queuing things up, now waiting for results queue to drain 28173 1726882752.40925: waiting for pending results... 
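
Queued here is "Install iproute", which the log resolves through the package action plugin and gates on the same ansible_distribution_major_version != '6' conditional as the earlier tasks. As the following lines show, the `__network_is_ostree` fact is consulted while the task arguments are resolved, which suggests an ostree-aware package backend is selected when that fact is true. A hedged sketch of the task; only the action, package name and state are reasonably confirmed by the log.

# --- hedged reconstruction, not taken verbatim from the log ---
# 'Install iproute' (manage_test_interface.yml:16). The read of
# '__network_is_ostree' suggests the real task also sets the package module's
# 'use:' option to an ostree-aware backend, but that expression is not visible
# in the log, so it is left out here.
- name: Install iproute
  package:
    name: iproute
    state: present
# --- end of reconstruction ---
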
28173 1726882752.41160: running TaskExecutor() for managed_node2/TASK: Install iproute 28173 1726882752.41266: in run() - task 0e448fcc-3ce9-926c-8928-00000000026d 28173 1726882752.41292: variable 'ansible_search_path' from source: unknown 28173 1726882752.41300: variable 'ansible_search_path' from source: unknown 28173 1726882752.41346: calling self._execute() 28173 1726882752.41445: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.41455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.41481: variable 'omit' from source: magic vars 28173 1726882752.41881: variable 'ansible_distribution_major_version' from source: facts 28173 1726882752.41899: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882752.41914: variable 'omit' from source: magic vars 28173 1726882752.41955: variable 'omit' from source: magic vars 28173 1726882752.42175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882752.44578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882752.44943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882752.44986: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882752.45028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882752.45056: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882752.45151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882752.45185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882752.45221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882752.45267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882752.45287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882752.45394: variable '__network_is_ostree' from source: set_fact 28173 1726882752.45403: variable 'omit' from source: magic vars 28173 1726882752.45438: variable 'omit' from source: magic vars 28173 1726882752.45469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882752.45501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882752.45523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882752.45548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28173 1726882752.45561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882752.45594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882752.45602: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.45609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.45709: Set connection var ansible_pipelining to False 28173 1726882752.45716: Set connection var ansible_shell_type to sh 28173 1726882752.45729: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882752.45740: Set connection var ansible_timeout to 10 28173 1726882752.45748: Set connection var ansible_shell_executable to /bin/sh 28173 1726882752.45760: Set connection var ansible_connection to ssh 28173 1726882752.45788: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.45796: variable 'ansible_connection' from source: unknown 28173 1726882752.45802: variable 'ansible_module_compression' from source: unknown 28173 1726882752.45808: variable 'ansible_shell_type' from source: unknown 28173 1726882752.45814: variable 'ansible_shell_executable' from source: unknown 28173 1726882752.45820: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882752.45826: variable 'ansible_pipelining' from source: unknown 28173 1726882752.45832: variable 'ansible_timeout' from source: unknown 28173 1726882752.45838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882752.45938: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882752.45952: variable 'omit' from source: magic vars 28173 1726882752.45961: starting attempt loop 28173 1726882752.45972: running the handler 28173 1726882752.45982: variable 'ansible_facts' from source: unknown 28173 1726882752.45988: variable 'ansible_facts' from source: unknown 28173 1726882752.46021: _low_level_execute_command(): starting 28173 1726882752.46032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882752.46786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882752.46800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.46813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.46829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.46877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.46888: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.46901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.46916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882752.46927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882752.46936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 
1726882752.46950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.46962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.46981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.46993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.47003: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.47014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.47095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.47124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.47139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.47279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.48936: stdout chunk (state=3): >>>/root <<< 28173 1726882752.49127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.49131: stdout chunk (state=3): >>><<< 28173 1726882752.49133: stderr chunk (state=3): >>><<< 28173 1726882752.49241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882752.49245: _low_level_execute_command(): starting 28173 1726882752.49248: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708 `" && echo ansible-tmp-1726882752.4915233-28501-236222152236708="` echo /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708 `" ) && sleep 0' 28173 1726882752.49851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882752.49868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.49886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.49915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.49956: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.49971: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.50010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882752.50029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882752.50042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882752.50054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.50070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.50087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.50100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.50113: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882752.50135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.50213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.50243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.50261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.50391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.52309: stdout chunk (state=3): >>>ansible-tmp-1726882752.4915233-28501-236222152236708=/root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708 <<< 28173 1726882752.52492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.52495: stdout chunk (state=3): >>><<< 28173 1726882752.52498: stderr chunk (state=3): >>><<< 28173 1726882752.52773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882752.4915233-28501-236222152236708=/root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882752.52776: variable 'ansible_module_compression' from source: unknown 28173 1726882752.52780: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 
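The package action has resolved to the dnf backend (ansible.legacy.dnf above), and the rest of this task in the trace is Ansible's usual remote execution flow: build the AnsiballZ payload, create a per-task temp directory, transfer the payload over the existing multiplexed SSH connection, make it executable, run it with the target's Python, then remove the temp directory. The same steps, collected from this trace and lightly simplified (the backtick-echo quoting is dropped and the generated temp path is factored into a variable), look like this:

    # Remote-side flow for the dnf module, reconstructed from this trace.
    TMP=/root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708
    /bin/sh -c 'echo ~ && sleep 0'                                    # probe the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMP ) && sleep 0"   # per-task temp dir
    # sftp: put AnsiballZ_dnf.py into $TMP (the "Sending initial data" / "sftp> put" lines below)
    /bin/sh -c "chmod u+x $TMP/ $TMP/AnsiballZ_dnf.py && sleep 0"     # make the payload executable
    /bin/sh -c "/usr/bin/python3.9 $TMP/AnsiballZ_dnf.py && sleep 0"  # run the module with the target's python
    /bin/sh -c "rm -f -r $TMP/ > /dev/null 2>&1 && sleep 0"           # clean up the temp dir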
28173 1726882752.52782: ANSIBALLZ: Acquiring lock 28173 1726882752.52785: ANSIBALLZ: Lock acquired: 140243978110592 28173 1726882752.52787: ANSIBALLZ: Creating module 28173 1726882752.70458: ANSIBALLZ: Writing module into payload 28173 1726882752.70658: ANSIBALLZ: Writing module 28173 1726882752.70680: ANSIBALLZ: Renaming module 28173 1726882752.70689: ANSIBALLZ: Done creating module 28173 1726882752.70705: variable 'ansible_facts' from source: unknown 28173 1726882752.70760: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/AnsiballZ_dnf.py 28173 1726882752.70867: Sending initial data 28173 1726882752.70871: Sent initial data (152 bytes) 28173 1726882752.71483: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.71486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.71498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882752.71506: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882752.71511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.71532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.71534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.71597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.71602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882752.71606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.71708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.73526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 28173 1726882752.73533: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882752.73621: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882752.73718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpp4le623a /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/AnsiballZ_dnf.py <<< 28173 1726882752.73811: stderr chunk (state=3): >>>debug1: Couldn't stat remote 
file: No such file or directory <<< 28173 1726882752.75075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.75163: stderr chunk (state=3): >>><<< 28173 1726882752.75167: stdout chunk (state=3): >>><<< 28173 1726882752.75188: done transferring module to remote 28173 1726882752.75196: _low_level_execute_command(): starting 28173 1726882752.75201: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/ /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/AnsiballZ_dnf.py && sleep 0' 28173 1726882752.75629: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882752.75633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.75666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.75670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.75672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.75733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.75736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.75836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882752.77605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882752.77644: stderr chunk (state=3): >>><<< 28173 1726882752.77647: stdout chunk (state=3): >>><<< 28173 1726882752.77663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28173 1726882752.77670: _low_level_execute_command(): starting 28173 1726882752.77673: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/AnsiballZ_dnf.py && sleep 0' 28173 1726882752.78086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882752.78089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.78123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.78126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882752.78128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882752.78180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882752.78183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882752.78292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882753.81447: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 28173 1726882753.87627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882753.87679: stderr chunk (state=3): >>><<< 28173 1726882753.87682: stdout chunk (state=3): >>><<< 28173 1726882753.87698: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
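The dnf module came back with rc=0 and msg "Nothing to do", i.e. iproute was already present on managed_node2; the module_args echoed in the result show that the task effectively passed just name and state through to dnf, with every other option at its default. The same conclusion can be reached by hand on the target (illustrative, not part of this run):

    # Illustrative manual check on the managed node (not executed by this run):
    rpm -q iproute          # succeeds, so the package is already installed
    dnf install -y iproute  # reports "Nothing to do." when the installed version already satisfies the request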
28173 1726882753.87736: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882753.87741: _low_level_execute_command(): starting 28173 1726882753.87746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882752.4915233-28501-236222152236708/ > /dev/null 2>&1 && sleep 0' 28173 1726882753.88211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882753.88217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882753.88261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882753.88267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882753.88270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882753.88281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882753.88325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882753.88331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882753.88446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882753.90271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882753.90316: stderr chunk (state=3): >>><<< 28173 1726882753.90319: stdout chunk (state=3): >>><<< 28173 1726882753.90331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882753.90337: handler run complete 28173 1726882753.90448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882753.90572: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882753.90599: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882753.90621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882753.90644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882753.90701: variable '__install_status' from source: unknown 28173 1726882753.90715: Evaluated conditional (__install_status is success): True 28173 1726882753.90726: attempt loop complete, returning result 28173 1726882753.90729: _execute() done 28173 1726882753.90731: dumping result to json 28173 1726882753.90736: done dumping result, returning 28173 1726882753.90744: done running TaskExecutor() for managed_node2/TASK: Install iproute [0e448fcc-3ce9-926c-8928-00000000026d] 28173 1726882753.90749: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026d 28173 1726882753.90842: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026d 28173 1726882753.90847: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 28173 1726882753.90926: no more pending results, returning what we have 28173 1726882753.90929: results queue empty 28173 1726882753.90929: checking for any_errors_fatal 28173 1726882753.90936: done checking for any_errors_fatal 28173 1726882753.90937: checking for max_fail_percentage 28173 1726882753.90938: done checking for max_fail_percentage 28173 1726882753.90939: checking to see if all hosts have failed and the running result is not ok 28173 1726882753.90940: done checking to see if all hosts have failed 28173 1726882753.90941: getting the remaining hosts for this loop 28173 1726882753.90942: done getting the remaining hosts for this loop 28173 1726882753.90946: getting the next task for host managed_node2 28173 1726882753.90951: done getting next task for host managed_node2 28173 1726882753.90954: ^ task is: TASK: Create veth interface {{ interface }} 28173 1726882753.90958: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882753.90961: getting variables 28173 1726882753.90963: in VariableManager get_vars() 28173 1726882753.91005: Calling all_inventory to load vars for managed_node2 28173 1726882753.91007: Calling groups_inventory to load vars for managed_node2 28173 1726882753.91010: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882753.91020: Calling all_plugins_play to load vars for managed_node2 28173 1726882753.91022: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882753.91024: Calling groups_plugins_play to load vars for managed_node2 28173 1726882753.91187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882753.91315: done with get_vars() 28173 1726882753.91324: done getting variables 28173 1726882753.91369: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882753.91488: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:13 -0400 (0:00:01.510) 0:00:07.079 ****** 28173 1726882753.91518: entering _queue_task() for managed_node2/command 28173 1726882753.91771: worker is 1 (out of 1 available) 28173 1726882753.91782: exiting _queue_task() for managed_node2/command 28173 1726882753.91794: done queuing things up, now waiting for results queue to drain 28173 1726882753.91795: waiting for pending results... 
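The task queued here is "Create veth interface {{ interface }}" from manage_test_interface.yml:27, with interface templated to ethtest0. As the TaskExecutor output below shows, it only runs because type == 'veth', state == 'present' and ethtest0 is not in current_interfaces, and it iterates over the items lookup (only the first item is visible in this excerpt). The command it ends up executing on the target is the one reported in the module result further down; the trailing ip link show is just an illustrative way to verify the result and is not part of the run:

    # First command executed by this task (taken verbatim from the module result below):
    ip link add ethtest0 type veth peer name peerethtest0
    # Illustrative verification, not part of this run:
    ip -br link show type veth      # lists ethtest0@peerethtest0 and peerethtest0@ethtest0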
28173 1726882753.92047: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 28173 1726882753.92155: in run() - task 0e448fcc-3ce9-926c-8928-00000000026e 28173 1726882753.92180: variable 'ansible_search_path' from source: unknown 28173 1726882753.92188: variable 'ansible_search_path' from source: unknown 28173 1726882753.92443: variable 'interface' from source: set_fact 28173 1726882753.92529: variable 'interface' from source: set_fact 28173 1726882753.92609: variable 'interface' from source: set_fact 28173 1726882753.92749: Loaded config def from plugin (lookup/items) 28173 1726882753.92761: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 28173 1726882753.92796: variable 'omit' from source: magic vars 28173 1726882753.92921: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882753.92936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882753.92951: variable 'omit' from source: magic vars 28173 1726882753.93237: variable 'ansible_distribution_major_version' from source: facts 28173 1726882753.93250: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882753.93457: variable 'type' from source: set_fact 28173 1726882753.93473: variable 'state' from source: include params 28173 1726882753.93484: variable 'interface' from source: set_fact 28173 1726882753.93492: variable 'current_interfaces' from source: set_fact 28173 1726882753.93503: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28173 1726882753.93514: variable 'omit' from source: magic vars 28173 1726882753.93558: variable 'omit' from source: magic vars 28173 1726882753.93610: variable 'item' from source: unknown 28173 1726882753.93689: variable 'item' from source: unknown 28173 1726882753.93712: variable 'omit' from source: magic vars 28173 1726882753.93747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882753.93789: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882753.93811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882753.93832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882753.93848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882753.93890: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882753.93899: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882753.93907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882753.94016: Set connection var ansible_pipelining to False 28173 1726882753.94024: Set connection var ansible_shell_type to sh 28173 1726882753.94037: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882753.94049: Set connection var ansible_timeout to 10 28173 1726882753.94059: Set connection var ansible_shell_executable to /bin/sh 28173 1726882753.94073: Set connection var ansible_connection to ssh 28173 1726882753.94102: variable 'ansible_shell_executable' from source: unknown 28173 1726882753.94110: variable 'ansible_connection' from source: unknown 28173 1726882753.94116: variable 
'ansible_module_compression' from source: unknown 28173 1726882753.94123: variable 'ansible_shell_type' from source: unknown 28173 1726882753.94130: variable 'ansible_shell_executable' from source: unknown 28173 1726882753.94137: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882753.94144: variable 'ansible_pipelining' from source: unknown 28173 1726882753.94151: variable 'ansible_timeout' from source: unknown 28173 1726882753.94159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882753.94304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882753.94322: variable 'omit' from source: magic vars 28173 1726882753.94331: starting attempt loop 28173 1726882753.94338: running the handler 28173 1726882753.94358: _low_level_execute_command(): starting 28173 1726882753.94376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882753.95113: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882753.95127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882753.95142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882753.95161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882753.95212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882753.95224: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882753.95239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882753.95258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882753.95278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882753.95294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882753.95307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882753.95320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882753.95337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882753.95350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882753.95361: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882753.95380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882753.95457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882753.95486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882753.95504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882753.95732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882753.97278: stdout chunk (state=3): >>>/root <<< 28173 1726882753.97384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 
1726882753.97450: stderr chunk (state=3): >>><<< 28173 1726882753.97453: stdout chunk (state=3): >>><<< 28173 1726882753.97563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882753.97575: _low_level_execute_command(): starting 28173 1726882753.97578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273 `" && echo ansible-tmp-1726882753.9747443-28561-16940039230273="` echo /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273 `" ) && sleep 0' 28173 1726882753.98159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882753.98177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882753.98192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882753.98211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882753.98256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882753.98272: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882753.98287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882753.98304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882753.98316: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882753.98333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882753.98345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882753.98359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882753.98382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882753.98395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882753.98406: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882753.98419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 28173 1726882753.98502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882753.98523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882753.98539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882753.98684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.00601: stdout chunk (state=3): >>>ansible-tmp-1726882753.9747443-28561-16940039230273=/root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273 <<< 28173 1726882754.00798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.00801: stdout chunk (state=3): >>><<< 28173 1726882754.00804: stderr chunk (state=3): >>><<< 28173 1726882754.01075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882753.9747443-28561-16940039230273=/root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.01079: variable 'ansible_module_compression' from source: unknown 28173 1726882754.01081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882754.01083: variable 'ansible_facts' from source: unknown 28173 1726882754.01085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/AnsiballZ_command.py 28173 1726882754.01183: Sending initial data 28173 1726882754.01185: Sent initial data (155 bytes) 28173 1726882754.02160: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.02184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.02200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.02220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.02261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.02285: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.02302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.02319: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 28173 1726882754.02331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.02348: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.02370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.02401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.02424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.02443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.02461: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882754.02481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.02559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.02586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.02607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.02743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.04514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882754.04608: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882754.04708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6w6q9cko /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/AnsiballZ_command.py <<< 28173 1726882754.04802: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882754.06307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.06371: stderr chunk (state=3): >>><<< 28173 1726882754.06374: stdout chunk (state=3): >>><<< 28173 1726882754.06376: done transferring module to remote 28173 1726882754.06378: _low_level_execute_command(): starting 28173 1726882754.06381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/ /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/AnsiballZ_command.py && sleep 0' 28173 1726882754.06987: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.07001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.07016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.07034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.07081: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.07094: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.07108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.07126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882754.07139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.07151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.07168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.07183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.07199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.07211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.07222: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882754.07236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.07312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.07333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.07349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.07488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.09346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.09349: stdout chunk (state=3): >>><<< 28173 1726882754.09351: stderr chunk (state=3): >>><<< 28173 1726882754.09440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.09444: _low_level_execute_command(): starting 28173 1726882754.09447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/AnsiballZ_command.py && sleep 0' 28173 1726882754.12204: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 
1726882754.12208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.12243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.12249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.12252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.12432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.12497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.12500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.12818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.27420: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:39:14.259706", "end": "2024-09-20 21:39:14.268403", "delta": "0:00:00.008697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882754.29776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882754.29780: stdout chunk (state=3): >>><<< 28173 1726882754.29782: stderr chunk (state=3): >>><<< 28173 1726882754.29929: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:39:14.259706", "end": "2024-09-20 21:39:14.268403", "delta": "0:00:00.008697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
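Note: the JSON result above is the first loop item of the "Create veth interface ethtest0" task; the ansible.legacy.command module ran "ip link add ethtest0 type veth peer name peerethtest0" on the managed host over the multiplexed SSH connection. The task file itself (manage_test_interface.yml) is not reproduced in this log, so the following is only a hedged sketch of a loop-driven command task that would produce the three results recorded in this run. The variable names (interface, type, state, current_interfaces) mirror what the log reports as evaluated conditionals; the exact contents of the real task file are an assumption.

  # Hedged sketch, not copied from manage_test_interface.yml.
  # In this run, interface expands to ethtest0.
  - name: Create veth interface {{ interface }}
    ansible.builtin.command: "{{ item }}"
    loop:
      - "ip link add {{ interface }} type veth peer name peer{{ interface }}"
      - "ip link set peer{{ interface }} up"
      - "ip link set {{ interface }} up"
    when:
      - type == 'veth'
      - state == 'present'
      - interface not in current_interfaces

With interface set to ethtest0, the loop expands to the literal items that appear in the per-item results later in the log.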
28173 1726882754.29938: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882754.29940: _low_level_execute_command(): starting 28173 1726882754.29943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882753.9747443-28561-16940039230273/ > /dev/null 2>&1 && sleep 0' 28173 1726882754.31052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.31067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.31083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.31110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.31149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.31161: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.31180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.31198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882754.31211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.31227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.31241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.31266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.31287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.31297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.31308: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882754.31322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.31412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.31436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.31450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.31601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.34239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.34242: stdout chunk (state=3): >>><<< 28173 1726882754.34245: stderr chunk (state=3): >>><<< 28173 1726882754.34873: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.34876: handler run complete 28173 1726882754.34879: Evaluated conditional (False): False 28173 1726882754.34881: attempt loop complete, returning result 28173 1726882754.34883: variable 'item' from source: unknown 28173 1726882754.34885: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.008697", "end": "2024-09-20 21:39:14.268403", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:39:14.259706" } 28173 1726882754.35116: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.35120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.35123: variable 'omit' from source: magic vars 28173 1726882754.35126: variable 'ansible_distribution_major_version' from source: facts 28173 1726882754.35128: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882754.35189: variable 'type' from source: set_fact 28173 1726882754.35199: variable 'state' from source: include params 28173 1726882754.35207: variable 'interface' from source: set_fact 28173 1726882754.35210: variable 'current_interfaces' from source: set_fact 28173 1726882754.35216: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28173 1726882754.35223: variable 'omit' from source: magic vars 28173 1726882754.35269: variable 'omit' from source: magic vars 28173 1726882754.35312: variable 'item' from source: unknown 28173 1726882754.35389: variable 'item' from source: unknown 28173 1726882754.35407: variable 'omit' from source: magic vars 28173 1726882754.35430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882754.35447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882754.35473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882754.35491: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882754.35498: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.35504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.35606: Set connection var ansible_pipelining to False 28173 1726882754.35616: Set connection var ansible_shell_type to sh 28173 1726882754.35629: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882754.35640: Set connection var ansible_timeout to 10 28173 1726882754.35648: Set connection var ansible_shell_executable to /bin/sh 28173 1726882754.35655: Set connection var ansible_connection to ssh 28173 1726882754.35692: variable 'ansible_shell_executable' from source: unknown 28173 1726882754.35699: variable 'ansible_connection' from source: unknown 28173 1726882754.35705: variable 'ansible_module_compression' from source: unknown 28173 1726882754.35714: variable 'ansible_shell_type' from source: unknown 28173 1726882754.35720: variable 'ansible_shell_executable' from source: unknown 28173 1726882754.35732: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.35740: variable 'ansible_pipelining' from source: unknown 28173 1726882754.35746: variable 'ansible_timeout' from source: unknown 28173 1726882754.35752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.35873: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882754.35889: variable 'omit' from source: magic vars 28173 1726882754.35911: starting attempt loop 28173 1726882754.35918: running the handler 28173 1726882754.35929: _low_level_execute_command(): starting 28173 1726882754.35937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882754.37160: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.37179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.37197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.37223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.37270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.37283: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.37302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.37325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882754.37336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.37350: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.37361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.37381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.37396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.37407: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.37417: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882754.37434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.37517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.37545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.37584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.37728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.39342: stdout chunk (state=3): >>>/root <<< 28173 1726882754.39447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.39495: stderr chunk (state=3): >>><<< 28173 1726882754.39497: stdout chunk (state=3): >>><<< 28173 1726882754.39571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.39576: _low_level_execute_command(): starting 28173 1726882754.39582: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469 `" && echo ansible-tmp-1726882754.3950996-28561-42911332144469="` echo /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469 `" ) && sleep 0' 28173 1726882754.40214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.40245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.40275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.40305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.40401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.40411: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.40424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.40442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 
1726882754.40468: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.40509: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.40512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.40514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.40573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.40576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.40698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.42581: stdout chunk (state=3): >>>ansible-tmp-1726882754.3950996-28561-42911332144469=/root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469 <<< 28173 1726882754.42699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.42744: stderr chunk (state=3): >>><<< 28173 1726882754.42748: stdout chunk (state=3): >>><<< 28173 1726882754.42770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882754.3950996-28561-42911332144469=/root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.42786: variable 'ansible_module_compression' from source: unknown 28173 1726882754.42815: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882754.42830: variable 'ansible_facts' from source: unknown 28173 1726882754.42882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/AnsiballZ_command.py 28173 1726882754.42978: Sending initial data 28173 1726882754.42981: Sent initial data (155 bytes) 28173 1726882754.43610: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.43649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.43652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.43654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.43711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.43714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.43821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.45592: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 28173 1726882754.45598: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882754.45688: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882754.45788: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpqd66tpq9 /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/AnsiballZ_command.py <<< 28173 1726882754.45884: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882754.46906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.46995: stderr chunk (state=3): >>><<< 28173 1726882754.46998: stdout chunk (state=3): >>><<< 28173 1726882754.47011: done transferring module to remote 28173 1726882754.47018: _low_level_execute_command(): starting 28173 1726882754.47023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/ /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/AnsiballZ_command.py && sleep 0' 28173 1726882754.47437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.47442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.47474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.47489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.47499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.47543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.47554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.47659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.49454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.49496: stderr chunk (state=3): >>><<< 28173 1726882754.49499: stdout chunk (state=3): >>><<< 28173 1726882754.49512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.49517: _low_level_execute_command(): starting 28173 1726882754.49524: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/AnsiballZ_command.py && sleep 0' 28173 1726882754.49912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.49918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.49950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.49961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28173 1726882754.49975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.50020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.50031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.50142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.63654: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:39:14.631064", "end": "2024-09-20 21:39:14.634615", "delta": "0:00:00.003551", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882754.64822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882754.64876: stderr chunk (state=3): >>><<< 28173 1726882754.64879: stdout chunk (state=3): >>><<< 28173 1726882754.64893: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:39:14.631064", "end": "2024-09-20 21:39:14.634615", "delta": "0:00:00.003551", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
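Note: the exchange above also shows the remote execution pipeline that Ansible repeats for every loop item: create a per-task temp directory under /root/.ansible/tmp, transfer the AnsiballZ_command.py payload over SFTP, chmod u+x it, run it with /usr/bin/python3.9, then remove the temp directory. A small follow-up check, not present in this recorded run, could confirm that both ends of the pair exist after the loop; the task below is a hypothetical sketch and its name is invented for illustration.

  # Hypothetical verification task, not part of the recorded run.
  - name: Verify both ends of the veth pair exist
    ansible.builtin.command: ip link show {{ item }}
    loop:
      - ethtest0
      - peerethtest0
    changed_when: false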
28173 1726882754.64919: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882754.64924: _low_level_execute_command(): starting 28173 1726882754.64929: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882754.3950996-28561-42911332144469/ > /dev/null 2>&1 && sleep 0' 28173 1726882754.65376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.65381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.65451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.65479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.65491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.65595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.67435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.67480: stderr chunk (state=3): >>><<< 28173 1726882754.67483: stdout chunk (state=3): >>><<< 28173 1726882754.67500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.67503: handler run complete 28173 1726882754.67517: Evaluated conditional (False): False 28173 1726882754.67524: attempt loop complete, returning result 28173 1726882754.67539: variable 'item' from source: unknown 28173 1726882754.67601: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003551", "end": "2024-09-20 21:39:14.634615", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:39:14.631064" } 28173 1726882754.67720: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.67723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.67726: variable 'omit' from source: magic vars 28173 1726882754.67819: variable 'ansible_distribution_major_version' from source: facts 28173 1726882754.67822: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882754.67939: variable 'type' from source: set_fact 28173 1726882754.67944: variable 'state' from source: include params 28173 1726882754.67947: variable 'interface' from source: set_fact 28173 1726882754.67953: variable 'current_interfaces' from source: set_fact 28173 1726882754.67956: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 28173 1726882754.67958: variable 'omit' from source: magic vars 28173 1726882754.67974: variable 'omit' from source: magic vars 28173 1726882754.67999: variable 'item' from source: unknown 28173 1726882754.68040: variable 'item' from source: unknown 28173 1726882754.68052: variable 'omit' from source: magic vars 28173 1726882754.68074: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882754.68078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882754.68081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882754.68093: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882754.68095: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.68097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.68142: Set connection var ansible_pipelining to False 28173 1726882754.68145: Set connection var ansible_shell_type to sh 28173 1726882754.68151: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882754.68157: Set connection var ansible_timeout to 10 28173 1726882754.68162: Set connection var ansible_shell_executable to /bin/sh 28173 1726882754.68168: Set connection var ansible_connection to ssh 28173 1726882754.68184: variable 'ansible_shell_executable' from source: unknown 28173 1726882754.68186: variable 'ansible_connection' from source: unknown 28173 1726882754.68189: variable 
'ansible_module_compression' from source: unknown 28173 1726882754.68195: variable 'ansible_shell_type' from source: unknown 28173 1726882754.68197: variable 'ansible_shell_executable' from source: unknown 28173 1726882754.68199: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882754.68201: variable 'ansible_pipelining' from source: unknown 28173 1726882754.68203: variable 'ansible_timeout' from source: unknown 28173 1726882754.68205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882754.68267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882754.68271: variable 'omit' from source: magic vars 28173 1726882754.68276: starting attempt loop 28173 1726882754.68278: running the handler 28173 1726882754.68289: _low_level_execute_command(): starting 28173 1726882754.68291: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882754.68706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.68711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.68747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882754.68759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.68807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.68819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.68925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.70507: stdout chunk (state=3): >>>/root <<< 28173 1726882754.70601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.70642: stderr chunk (state=3): >>><<< 28173 1726882754.70646: stdout chunk (state=3): >>><<< 28173 1726882754.70667: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.70673: _low_level_execute_command(): starting 28173 1726882754.70679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662 `" && echo ansible-tmp-1726882754.7066257-28561-265672358239662="` echo /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662 `" ) && sleep 0' 28173 1726882754.71104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.71110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.71137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.71148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.71196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.71208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.71219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.71328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.73209: stdout chunk (state=3): >>>ansible-tmp-1726882754.7066257-28561-265672358239662=/root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662 <<< 28173 1726882754.73319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.73361: stderr chunk (state=3): >>><<< 28173 1726882754.73369: stdout chunk (state=3): >>><<< 28173 1726882754.73378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882754.7066257-28561-265672358239662=/root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.73393: variable 'ansible_module_compression' from source: unknown 28173 1726882754.73421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882754.73434: variable 'ansible_facts' from source: unknown 28173 1726882754.73489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/AnsiballZ_command.py 28173 1726882754.73579: Sending initial data 28173 1726882754.73582: Sent initial data (156 bytes) 28173 1726882754.74199: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.74204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.74240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.74251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.74301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.74312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.74417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.76206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882754.76300: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server 
upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882754.76398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpqpq_cjmu /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/AnsiballZ_command.py <<< 28173 1726882754.76494: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882754.77510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.77596: stderr chunk (state=3): >>><<< 28173 1726882754.77600: stdout chunk (state=3): >>><<< 28173 1726882754.77612: done transferring module to remote 28173 1726882754.77618: _low_level_execute_command(): starting 28173 1726882754.77622: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/ /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/AnsiballZ_command.py && sleep 0' 28173 1726882754.78020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.78026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.78054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.78071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.78119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.78131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.78235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.79995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.80035: stderr chunk (state=3): >>><<< 28173 1726882754.80040: stdout chunk (state=3): >>><<< 28173 1726882754.80052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.80058: _low_level_execute_command(): starting 28173 1726882754.80061: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/AnsiballZ_command.py && sleep 0' 28173 1726882754.80453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.80469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.80489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.80501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.80545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.80556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.80676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.94453: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:39:14.935503", "end": "2024-09-20 21:39:14.941792", "delta": "0:00:00.006289", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882754.95688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882754.95743: stderr chunk (state=3): >>><<< 28173 1726882754.95746: stdout chunk (state=3): >>><<< 28173 1726882754.95886: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:39:14.935503", "end": "2024-09-20 21:39:14.941792", "delta": "0:00:00.006289", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
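Note: with this third item the veth pair is fully created and up on the managed host. The log only exercises the state == 'present' branch of the test; the sketch below shows a plausible teardown counterpart for state == 'absent'. It is an assumption that the test file handles removal this way, though deleting one end of a veth pair does remove its peer as well.

  # Hypothetical teardown counterpart, assumed rather than taken from the log.
  - name: Remove veth interface {{ interface }}
    ansible.builtin.command: ip link del {{ interface }}
    when:
      - type == 'veth'
      - state == 'absent'
      - interface in current_interfaces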
28173 1726882754.95890: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882754.95892: _low_level_execute_command(): starting 28173 1726882754.95895: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882754.7066257-28561-265672358239662/ > /dev/null 2>&1 && sleep 0' 28173 1726882754.96526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882754.96551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.96573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.96592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.96635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.96655: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882754.96681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.96700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882754.96713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882754.96726: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882754.96737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882754.96751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882754.96781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882754.96795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882754.96808: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882754.96824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882754.96911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882754.96933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882754.96950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882754.97093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882754.98898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882754.98979: stderr chunk (state=3): >>><<< 28173 1726882754.98990: stdout chunk (state=3): >>><<< 28173 1726882754.99177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882754.99185: handler run complete 28173 1726882754.99187: Evaluated conditional (False): False 28173 1726882754.99190: attempt loop complete, returning result 28173 1726882754.99192: variable 'item' from source: unknown 28173 1726882754.99194: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.006289", "end": "2024-09-20 21:39:14.941792", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:39:14.935503" } 28173 1726882754.99403: dumping result to json 28173 1726882754.99406: done dumping result, returning 28173 1726882754.99408: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 [0e448fcc-3ce9-926c-8928-00000000026e] 28173 1726882754.99411: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026e 28173 1726882755.00046: no more pending results, returning what we have 28173 1726882755.00049: results queue empty 28173 1726882755.00050: checking for any_errors_fatal 28173 1726882755.00053: done checking for any_errors_fatal 28173 1726882755.00054: checking for max_fail_percentage 28173 1726882755.00055: done checking for max_fail_percentage 28173 1726882755.00056: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.00056: done checking to see if all hosts have failed 28173 1726882755.00057: getting the remaining hosts for this loop 28173 1726882755.00058: done getting the remaining hosts for this loop 28173 1726882755.00061: getting the next task for host managed_node2 28173 1726882755.00070: done getting next task for host managed_node2 28173 1726882755.00073: ^ task is: TASK: Set up veth as managed by NetworkManager 28173 1726882755.00075: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.00079: getting variables 28173 1726882755.00080: in VariableManager get_vars() 28173 1726882755.00108: Calling all_inventory to load vars for managed_node2 28173 1726882755.00111: Calling groups_inventory to load vars for managed_node2 28173 1726882755.00113: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.00122: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.00124: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.00126: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.00294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.00523: done with get_vars() 28173 1726882755.00534: done getting variables 28173 1726882755.00588: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026e 28173 1726882755.00591: WORKER PROCESS EXITING 28173 1726882755.00608: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:39:15 -0400 (0:00:01.091) 0:00:08.170 ****** 28173 1726882755.00636: entering _queue_task() for managed_node2/command 28173 1726882755.00886: worker is 1 (out of 1 available) 28173 1726882755.00903: exiting _queue_task() for managed_node2/command 28173 1726882755.00916: done queuing things up, now waiting for results queue to drain 28173 1726882755.00917: waiting for pending results... 
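For context on the result just above: the "Create veth interface ethtest0" task is a looped command task, and "ip link set ethtest0 up" is the loop item that produced this result (ansible_loop_var is "item"). A minimal sketch of what such a task could look like in manage_test_interface.yml, built only from what this excerpt shows; earlier loop items and any changed_when handling are not visible here, and the when guard is assumed by analogy with the next task's guard:

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    # earlier loop items do not appear in this excerpt
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present'   # assumption; not logged for this task in this excerpt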
28173 1726882755.01177: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 28173 1726882755.01290: in run() - task 0e448fcc-3ce9-926c-8928-00000000026f 28173 1726882755.01311: variable 'ansible_search_path' from source: unknown 28173 1726882755.01320: variable 'ansible_search_path' from source: unknown 28173 1726882755.01370: calling self._execute() 28173 1726882755.01456: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.01476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.01490: variable 'omit' from source: magic vars 28173 1726882755.01855: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.01878: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.02046: variable 'type' from source: set_fact 28173 1726882755.02055: variable 'state' from source: include params 28173 1726882755.02067: Evaluated conditional (type == 'veth' and state == 'present'): True 28173 1726882755.02080: variable 'omit' from source: magic vars 28173 1726882755.02126: variable 'omit' from source: magic vars 28173 1726882755.02231: variable 'interface' from source: set_fact 28173 1726882755.02252: variable 'omit' from source: magic vars 28173 1726882755.02296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882755.02338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882755.02360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882755.02385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882755.02401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882755.02439: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882755.02446: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.02453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.02558: Set connection var ansible_pipelining to False 28173 1726882755.02570: Set connection var ansible_shell_type to sh 28173 1726882755.02583: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882755.02595: Set connection var ansible_timeout to 10 28173 1726882755.02603: Set connection var ansible_shell_executable to /bin/sh 28173 1726882755.02611: Set connection var ansible_connection to ssh 28173 1726882755.02635: variable 'ansible_shell_executable' from source: unknown 28173 1726882755.02641: variable 'ansible_connection' from source: unknown 28173 1726882755.02653: variable 'ansible_module_compression' from source: unknown 28173 1726882755.02659: variable 'ansible_shell_type' from source: unknown 28173 1726882755.02669: variable 'ansible_shell_executable' from source: unknown 28173 1726882755.02677: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.02684: variable 'ansible_pipelining' from source: unknown 28173 1726882755.02690: variable 'ansible_timeout' from source: unknown 28173 1726882755.02697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.02845: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882755.02860: variable 'omit' from source: magic vars 28173 1726882755.02879: starting attempt loop 28173 1726882755.02886: running the handler 28173 1726882755.02902: _low_level_execute_command(): starting 28173 1726882755.02913: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882755.03692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.03708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.03724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.03750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.03802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.03815: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.03829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.03848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.03872: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.03886: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.03899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.03913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.03930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.03943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.03954: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.03978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.04052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.04087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.04105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.04242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.05905: stdout chunk (state=3): >>>/root <<< 28173 1726882755.06015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.06098: stderr chunk (state=3): >>><<< 28173 1726882755.06112: stdout chunk (state=3): >>><<< 28173 1726882755.06228: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.06232: _low_level_execute_command(): starting 28173 1726882755.06242: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402 `" && echo ansible-tmp-1726882755.061415-28612-270047033036402="` echo /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402 `" ) && sleep 0' 28173 1726882755.06841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.06855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.06875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.06894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.06943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.06956: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.06975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.06993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.07005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.07026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.07039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.07054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.07076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.07090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.07102: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.07118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.07202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.07223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.07248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.07385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.09284: stdout chunk (state=3): >>>ansible-tmp-1726882755.061415-28612-270047033036402=/root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402 <<< 28173 
1726882755.09469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.09472: stdout chunk (state=3): >>><<< 28173 1726882755.09475: stderr chunk (state=3): >>><<< 28173 1726882755.09774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882755.061415-28612-270047033036402=/root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.09778: variable 'ansible_module_compression' from source: unknown 28173 1726882755.09780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882755.09782: variable 'ansible_facts' from source: unknown 28173 1726882755.09784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/AnsiballZ_command.py 28173 1726882755.09843: Sending initial data 28173 1726882755.09846: Sent initial data (155 bytes) 28173 1726882755.10808: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.10823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.10838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.10857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.10906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.10918: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.10933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.10951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.10966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.10979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.10996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.11011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.11028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 
1726882755.11041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.11052: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.11067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.11147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.11171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.11189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.11319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.13063: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882755.13155: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882755.13257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmprn67_pi3 /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/AnsiballZ_command.py <<< 28173 1726882755.13351: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882755.14712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.14873: stderr chunk (state=3): >>><<< 28173 1726882755.14876: stdout chunk (state=3): >>><<< 28173 1726882755.14878: done transferring module to remote 28173 1726882755.14958: _low_level_execute_command(): starting 28173 1726882755.14961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/ /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/AnsiballZ_command.py && sleep 0' 28173 1726882755.15512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.15526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.15541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.15558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.15605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.15617: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.15630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.15646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.15657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.15674: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 28173 1726882755.15686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.15700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.15715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.15727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.15737: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.15750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.15821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.15842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.15858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.15992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.17790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.17831: stderr chunk (state=3): >>><<< 28173 1726882755.17835: stdout chunk (state=3): >>><<< 28173 1726882755.17930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.17933: _low_level_execute_command(): starting 28173 1726882755.17936: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/AnsiballZ_command.py && sleep 0' 28173 1726882755.18641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.18655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.18672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.18693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.18757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.18772: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.18787: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.18805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.18823: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.18851: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.18865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.18881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.18896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.18909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.18920: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.18944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.19031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.19062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.19088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.19252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.34200: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:39:15.320336", "end": "2024-09-20 21:39:15.339462", "delta": "0:00:00.019126", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882755.35585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882755.35644: stderr chunk (state=3): >>><<< 28173 1726882755.35647: stdout chunk (state=3): >>><<< 28173 1726882755.35773: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:39:15.320336", "end": "2024-09-20 21:39:15.339462", "delta": "0:00:00.019126", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
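Two details of the result above are worth noting: the command run on the remote is "nmcli d set ethtest0 managed true", and the raw module result reports changed=true while the task result printed a little further on shows changed=false, which suggests the task overrides the change status (for example with changed_when: false). A hedged sketch of the task at manage_test_interface.yml:35, using only the command and the conditional reported in this log:

- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'   # evaluated True earlier in this excerpt
  changed_when: false   # assumption, inferred from changed=true in the module result vs changed=false in the task result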
28173 1726882755.35782: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882755.35785: _low_level_execute_command(): starting 28173 1726882755.35788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882755.061415-28612-270047033036402/ > /dev/null 2>&1 && sleep 0' 28173 1726882755.37325: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.37341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.37357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.37378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.37419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.37582: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.37597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.37616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.37628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.37640: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.37653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.37671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.37688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.37701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.37713: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.37727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.37806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.37830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.37848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.37985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.39879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.39882: stdout chunk (state=3): >>><<< 28173 1726882755.39885: stderr chunk (state=3): >>><<< 28173 1726882755.39969: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.39973: handler run complete 28173 1726882755.39975: Evaluated conditional (False): False 28173 1726882755.39977: attempt loop complete, returning result 28173 1726882755.39979: _execute() done 28173 1726882755.39981: dumping result to json 28173 1726882755.39983: done dumping result, returning 28173 1726882755.39986: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-926c-8928-00000000026f] 28173 1726882755.40174: sending task result for task 0e448fcc-3ce9-926c-8928-00000000026f 28173 1726882755.40251: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000026f 28173 1726882755.40254: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019126", "end": "2024-09-20 21:39:15.339462", "rc": 0, "start": "2024-09-20 21:39:15.320336" } 28173 1726882755.40344: no more pending results, returning what we have 28173 1726882755.40348: results queue empty 28173 1726882755.40349: checking for any_errors_fatal 28173 1726882755.40362: done checking for any_errors_fatal 28173 1726882755.40365: checking for max_fail_percentage 28173 1726882755.40366: done checking for max_fail_percentage 28173 1726882755.40367: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.40368: done checking to see if all hosts have failed 28173 1726882755.40369: getting the remaining hosts for this loop 28173 1726882755.40370: done getting the remaining hosts for this loop 28173 1726882755.40375: getting the next task for host managed_node2 28173 1726882755.40381: done getting next task for host managed_node2 28173 1726882755.40385: ^ task is: TASK: Delete veth interface {{ interface }} 28173 1726882755.40388: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.40393: getting variables 28173 1726882755.40395: in VariableManager get_vars() 28173 1726882755.40439: Calling all_inventory to load vars for managed_node2 28173 1726882755.40442: Calling groups_inventory to load vars for managed_node2 28173 1726882755.40445: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.40457: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.40460: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.40465: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.40662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.41049: done with get_vars() 28173 1726882755.41061: done getting variables 28173 1726882755.41132: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882755.41265: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:15 -0400 (0:00:00.406) 0:00:08.577 ****** 28173 1726882755.41298: entering _queue_task() for managed_node2/command 28173 1726882755.41573: worker is 1 (out of 1 available) 28173 1726882755.41585: exiting _queue_task() for managed_node2/command 28173 1726882755.41597: done queuing things up, now waiting for results queue to drain 28173 1726882755.41598: waiting for pending results... 
28173 1726882755.41846: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 28173 1726882755.41946: in run() - task 0e448fcc-3ce9-926c-8928-000000000270 28173 1726882755.41965: variable 'ansible_search_path' from source: unknown 28173 1726882755.41972: variable 'ansible_search_path' from source: unknown 28173 1726882755.42017: calling self._execute() 28173 1726882755.42110: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.42120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.42133: variable 'omit' from source: magic vars 28173 1726882755.42489: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.42507: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.42723: variable 'type' from source: set_fact 28173 1726882755.42733: variable 'state' from source: include params 28173 1726882755.42742: variable 'interface' from source: set_fact 28173 1726882755.42758: variable 'current_interfaces' from source: set_fact 28173 1726882755.42773: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 28173 1726882755.42780: when evaluation is False, skipping this task 28173 1726882755.42787: _execute() done 28173 1726882755.42797: dumping result to json 28173 1726882755.42805: done dumping result, returning 28173 1726882755.42815: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 [0e448fcc-3ce9-926c-8928-000000000270] 28173 1726882755.42825: sending task result for task 0e448fcc-3ce9-926c-8928-000000000270 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28173 1726882755.42966: no more pending results, returning what we have 28173 1726882755.42970: results queue empty 28173 1726882755.42971: checking for any_errors_fatal 28173 1726882755.42981: done checking for any_errors_fatal 28173 1726882755.42982: checking for max_fail_percentage 28173 1726882755.42984: done checking for max_fail_percentage 28173 1726882755.42984: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.42986: done checking to see if all hosts have failed 28173 1726882755.42986: getting the remaining hosts for this loop 28173 1726882755.42988: done getting the remaining hosts for this loop 28173 1726882755.42991: getting the next task for host managed_node2 28173 1726882755.42997: done getting next task for host managed_node2 28173 1726882755.43000: ^ task is: TASK: Create dummy interface {{ interface }} 28173 1726882755.43004: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.43008: getting variables 28173 1726882755.43009: in VariableManager get_vars() 28173 1726882755.43054: Calling all_inventory to load vars for managed_node2 28173 1726882755.43056: Calling groups_inventory to load vars for managed_node2 28173 1726882755.43059: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.43073: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.43076: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.43079: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.43279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.43509: done with get_vars() 28173 1726882755.43519: done getting variables 28173 1726882755.43712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882755.43932: variable 'interface' from source: set_fact 28173 1726882755.43952: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000270 28173 1726882755.43955: WORKER PROCESS EXITING TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:15 -0400 (0:00:00.026) 0:00:08.604 ****** 28173 1726882755.43971: entering _queue_task() for managed_node2/command 28173 1726882755.44535: worker is 1 (out of 1 available) 28173 1726882755.44545: exiting _queue_task() for managed_node2/command 28173 1726882755.44556: done queuing things up, now waiting for results queue to drain 28173 1726882755.44557: waiting for pending results... 
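The variable-source lines in the skipped tasks show where the guard variables come from: 'state' is an include parameter, while 'type', 'interface' and 'current_interfaces' are set_fact results. A hedged sketch of how the surrounding test playbook could wire that up; the actual set_fact tasks and the include statement are not shown in this excerpt, so the shape below is an assumption consistent with the logged sources:

- name: Record the interface under test   # assumed shape of the earlier set_fact
  set_fact:
    interface: ethtest0
    type: veth
    # current_interfaces is also a set_fact per the log; how it is gathered is not visible here

- name: Manage the test interface   # assumed include; the log only confirms the task file path
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # 'state' comes from include params per the log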
28173 1726882755.44803: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 28173 1726882755.44901: in run() - task 0e448fcc-3ce9-926c-8928-000000000271 28173 1726882755.44919: variable 'ansible_search_path' from source: unknown 28173 1726882755.44926: variable 'ansible_search_path' from source: unknown 28173 1726882755.44963: calling self._execute() 28173 1726882755.45048: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.45060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.45078: variable 'omit' from source: magic vars 28173 1726882755.45426: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.45444: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.45649: variable 'type' from source: set_fact 28173 1726882755.45659: variable 'state' from source: include params 28173 1726882755.45669: variable 'interface' from source: set_fact 28173 1726882755.45678: variable 'current_interfaces' from source: set_fact 28173 1726882755.45689: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 28173 1726882755.45695: when evaluation is False, skipping this task 28173 1726882755.45700: _execute() done 28173 1726882755.45705: dumping result to json 28173 1726882755.45711: done dumping result, returning 28173 1726882755.45719: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 [0e448fcc-3ce9-926c-8928-000000000271] 28173 1726882755.45728: sending task result for task 0e448fcc-3ce9-926c-8928-000000000271 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 28173 1726882755.45877: no more pending results, returning what we have 28173 1726882755.45881: results queue empty 28173 1726882755.45882: checking for any_errors_fatal 28173 1726882755.45889: done checking for any_errors_fatal 28173 1726882755.45889: checking for max_fail_percentage 28173 1726882755.45891: done checking for max_fail_percentage 28173 1726882755.45891: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.45892: done checking to see if all hosts have failed 28173 1726882755.45893: getting the remaining hosts for this loop 28173 1726882755.45894: done getting the remaining hosts for this loop 28173 1726882755.45897: getting the next task for host managed_node2 28173 1726882755.45903: done getting next task for host managed_node2 28173 1726882755.45906: ^ task is: TASK: Delete dummy interface {{ interface }} 28173 1726882755.45909: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.45913: getting variables 28173 1726882755.45914: in VariableManager get_vars() 28173 1726882755.45952: Calling all_inventory to load vars for managed_node2 28173 1726882755.45954: Calling groups_inventory to load vars for managed_node2 28173 1726882755.45956: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.45973: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.45976: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.45980: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.46156: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000271 28173 1726882755.46160: WORKER PROCESS EXITING 28173 1726882755.46178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.46666: done with get_vars() 28173 1726882755.46676: done getting variables 28173 1726882755.46734: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882755.46837: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:15 -0400 (0:00:00.028) 0:00:08.633 ****** 28173 1726882755.46863: entering _queue_task() for managed_node2/command 28173 1726882755.47070: worker is 1 (out of 1 available) 28173 1726882755.47082: exiting _queue_task() for managed_node2/command 28173 1726882755.47092: done queuing things up, now waiting for results queue to drain 28173 1726882755.47094: waiting for pending results... 
28173 1726882755.47330: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 28173 1726882755.47442: in run() - task 0e448fcc-3ce9-926c-8928-000000000272 28173 1726882755.47459: variable 'ansible_search_path' from source: unknown 28173 1726882755.47467: variable 'ansible_search_path' from source: unknown 28173 1726882755.47508: calling self._execute() 28173 1726882755.47594: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.47606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.47620: variable 'omit' from source: magic vars 28173 1726882755.47976: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.47994: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.48198: variable 'type' from source: set_fact 28173 1726882755.48205: variable 'state' from source: include params 28173 1726882755.48212: variable 'interface' from source: set_fact 28173 1726882755.48217: variable 'current_interfaces' from source: set_fact 28173 1726882755.48226: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 28173 1726882755.48230: when evaluation is False, skipping this task 28173 1726882755.48240: _execute() done 28173 1726882755.48245: dumping result to json 28173 1726882755.48250: done dumping result, returning 28173 1726882755.48257: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 [0e448fcc-3ce9-926c-8928-000000000272] 28173 1726882755.48267: sending task result for task 0e448fcc-3ce9-926c-8928-000000000272 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28173 1726882755.48396: no more pending results, returning what we have 28173 1726882755.48400: results queue empty 28173 1726882755.48401: checking for any_errors_fatal 28173 1726882755.48411: done checking for any_errors_fatal 28173 1726882755.48412: checking for max_fail_percentage 28173 1726882755.48413: done checking for max_fail_percentage 28173 1726882755.48414: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.48415: done checking to see if all hosts have failed 28173 1726882755.48416: getting the remaining hosts for this loop 28173 1726882755.48418: done getting the remaining hosts for this loop 28173 1726882755.48422: getting the next task for host managed_node2 28173 1726882755.48428: done getting next task for host managed_node2 28173 1726882755.48431: ^ task is: TASK: Create tap interface {{ interface }} 28173 1726882755.48435: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.48439: getting variables 28173 1726882755.48441: in VariableManager get_vars() 28173 1726882755.48489: Calling all_inventory to load vars for managed_node2 28173 1726882755.48492: Calling groups_inventory to load vars for managed_node2 28173 1726882755.48495: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.48508: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.48511: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.48514: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.48724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.48950: done with get_vars() 28173 1726882755.48961: done getting variables 28173 1726882755.49133: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000272 28173 1726882755.49136: WORKER PROCESS EXITING 28173 1726882755.49169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882755.49351: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:15 -0400 (0:00:00.025) 0:00:08.658 ****** 28173 1726882755.49381: entering _queue_task() for managed_node2/command 28173 1726882755.49590: worker is 1 (out of 1 available) 28173 1726882755.49603: exiting _queue_task() for managed_node2/command 28173 1726882755.49614: done queuing things up, now waiting for results queue to drain 28173 1726882755.49615: waiting for pending results... 
28173 1726882755.49871: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 28173 1726882755.49978: in run() - task 0e448fcc-3ce9-926c-8928-000000000273 28173 1726882755.50003: variable 'ansible_search_path' from source: unknown 28173 1726882755.50010: variable 'ansible_search_path' from source: unknown 28173 1726882755.50048: calling self._execute() 28173 1726882755.50141: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.50152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.50172: variable 'omit' from source: magic vars 28173 1726882755.50529: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.50552: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.50769: variable 'type' from source: set_fact 28173 1726882755.50781: variable 'state' from source: include params 28173 1726882755.50789: variable 'interface' from source: set_fact 28173 1726882755.50798: variable 'current_interfaces' from source: set_fact 28173 1726882755.50811: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 28173 1726882755.50821: when evaluation is False, skipping this task 28173 1726882755.50827: _execute() done 28173 1726882755.50833: dumping result to json 28173 1726882755.50841: done dumping result, returning 28173 1726882755.50849: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 [0e448fcc-3ce9-926c-8928-000000000273] 28173 1726882755.50859: sending task result for task 0e448fcc-3ce9-926c-8928-000000000273 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 28173 1726882755.50991: no more pending results, returning what we have 28173 1726882755.50995: results queue empty 28173 1726882755.50995: checking for any_errors_fatal 28173 1726882755.50999: done checking for any_errors_fatal 28173 1726882755.51000: checking for max_fail_percentage 28173 1726882755.51002: done checking for max_fail_percentage 28173 1726882755.51003: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.51004: done checking to see if all hosts have failed 28173 1726882755.51004: getting the remaining hosts for this loop 28173 1726882755.51006: done getting the remaining hosts for this loop 28173 1726882755.51009: getting the next task for host managed_node2 28173 1726882755.51014: done getting next task for host managed_node2 28173 1726882755.51017: ^ task is: TASK: Delete tap interface {{ interface }} 28173 1726882755.51020: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.51024: getting variables 28173 1726882755.51025: in VariableManager get_vars() 28173 1726882755.51062: Calling all_inventory to load vars for managed_node2 28173 1726882755.51066: Calling groups_inventory to load vars for managed_node2 28173 1726882755.51069: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.51080: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.51083: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.51086: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.51330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.51571: done with get_vars() 28173 1726882755.51694: done getting variables 28173 1726882755.51726: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000273 28173 1726882755.51729: WORKER PROCESS EXITING 28173 1726882755.51759: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882755.51915: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:39:15 -0400 (0:00:00.025) 0:00:08.683 ****** 28173 1726882755.51941: entering _queue_task() for managed_node2/command 28173 1726882755.52139: worker is 1 (out of 1 available) 28173 1726882755.52149: exiting _queue_task() for managed_node2/command 28173 1726882755.52160: done queuing things up, now waiting for results queue to drain 28173 1726882755.52161: waiting for pending results... 
28173 1726882755.52402: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 28173 1726882755.52509: in run() - task 0e448fcc-3ce9-926c-8928-000000000274 28173 1726882755.52525: variable 'ansible_search_path' from source: unknown 28173 1726882755.52533: variable 'ansible_search_path' from source: unknown 28173 1726882755.52575: calling self._execute() 28173 1726882755.52656: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.52675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.52690: variable 'omit' from source: magic vars 28173 1726882755.53033: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.53054: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.53269: variable 'type' from source: set_fact 28173 1726882755.53280: variable 'state' from source: include params 28173 1726882755.53288: variable 'interface' from source: set_fact 28173 1726882755.53296: variable 'current_interfaces' from source: set_fact 28173 1726882755.53308: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 28173 1726882755.53315: when evaluation is False, skipping this task 28173 1726882755.53326: _execute() done 28173 1726882755.53333: dumping result to json 28173 1726882755.53340: done dumping result, returning 28173 1726882755.53349: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 [0e448fcc-3ce9-926c-8928-000000000274] 28173 1726882755.53359: sending task result for task 0e448fcc-3ce9-926c-8928-000000000274 28173 1726882755.53457: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000274 28173 1726882755.53467: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 28173 1726882755.53518: no more pending results, returning what we have 28173 1726882755.53521: results queue empty 28173 1726882755.53522: checking for any_errors_fatal 28173 1726882755.53528: done checking for any_errors_fatal 28173 1726882755.53528: checking for max_fail_percentage 28173 1726882755.53530: done checking for max_fail_percentage 28173 1726882755.53531: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.53532: done checking to see if all hosts have failed 28173 1726882755.53533: getting the remaining hosts for this loop 28173 1726882755.53534: done getting the remaining hosts for this loop 28173 1726882755.53537: getting the next task for host managed_node2 28173 1726882755.53545: done getting next task for host managed_node2 28173 1726882755.53548: ^ task is: TASK: Include the task 'assert_device_present.yml' 28173 1726882755.53551: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.53557: getting variables 28173 1726882755.53558: in VariableManager get_vars() 28173 1726882755.53598: Calling all_inventory to load vars for managed_node2 28173 1726882755.53600: Calling groups_inventory to load vars for managed_node2 28173 1726882755.53603: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.53615: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.53618: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.53621: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.53811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.54036: done with get_vars() 28173 1726882755.54046: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:21 Friday 20 September 2024 21:39:15 -0400 (0:00:00.022) 0:00:08.705 ****** 28173 1726882755.54155: entering _queue_task() for managed_node2/include_tasks 28173 1726882755.54495: worker is 1 (out of 1 available) 28173 1726882755.54510: exiting _queue_task() for managed_node2/include_tasks 28173 1726882755.54521: done queuing things up, now waiting for results queue to drain 28173 1726882755.54523: waiting for pending results... 28173 1726882755.54760: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 28173 1726882755.54858: in run() - task 0e448fcc-3ce9-926c-8928-00000000000e 28173 1726882755.54879: variable 'ansible_search_path' from source: unknown 28173 1726882755.54918: calling self._execute() 28173 1726882755.55009: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.55020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.55032: variable 'omit' from source: magic vars 28173 1726882755.55368: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.55393: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.55404: _execute() done 28173 1726882755.55411: dumping result to json 28173 1726882755.55417: done dumping result, returning 28173 1726882755.55426: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-926c-8928-00000000000e] 28173 1726882755.55435: sending task result for task 0e448fcc-3ce9-926c-8928-00000000000e 28173 1726882755.55543: no more pending results, returning what we have 28173 1726882755.55548: in VariableManager get_vars() 28173 1726882755.55590: Calling all_inventory to load vars for managed_node2 28173 1726882755.55592: Calling groups_inventory to load vars for managed_node2 28173 1726882755.55594: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.55605: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.55608: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.55610: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.55848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.56069: done with get_vars() 28173 1726882755.56076: variable 'ansible_search_path' from source: unknown 28173 1726882755.56093: we have included files to process 28173 
1726882755.56094: generating all_blocks data 28173 1726882755.56096: done generating all_blocks data 28173 1726882755.56101: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28173 1726882755.56103: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28173 1726882755.56105: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28173 1726882755.56432: in VariableManager get_vars() 28173 1726882755.56451: done with get_vars() 28173 1726882755.56480: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000000e 28173 1726882755.56483: WORKER PROCESS EXITING 28173 1726882755.56568: done processing included file 28173 1726882755.56570: iterating over new_blocks loaded from include file 28173 1726882755.56572: in VariableManager get_vars() 28173 1726882755.56587: done with get_vars() 28173 1726882755.56588: filtering new block on tags 28173 1726882755.56604: done filtering new block on tags 28173 1726882755.56606: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 28173 1726882755.56610: extending task lists for all hosts with included blocks 28173 1726882755.59107: done extending task lists 28173 1726882755.59108: done processing included files 28173 1726882755.59109: results queue empty 28173 1726882755.59110: checking for any_errors_fatal 28173 1726882755.59112: done checking for any_errors_fatal 28173 1726882755.59113: checking for max_fail_percentage 28173 1726882755.59114: done checking for max_fail_percentage 28173 1726882755.59115: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.59116: done checking to see if all hosts have failed 28173 1726882755.59122: getting the remaining hosts for this loop 28173 1726882755.59123: done getting the remaining hosts for this loop 28173 1726882755.59126: getting the next task for host managed_node2 28173 1726882755.59129: done getting next task for host managed_node2 28173 1726882755.59131: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28173 1726882755.59134: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.59136: getting variables 28173 1726882755.59137: in VariableManager get_vars() 28173 1726882755.59150: Calling all_inventory to load vars for managed_node2 28173 1726882755.59152: Calling groups_inventory to load vars for managed_node2 28173 1726882755.59154: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.59160: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.59162: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.59168: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.59322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.59556: done with get_vars() 28173 1726882755.59568: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:39:15 -0400 (0:00:00.054) 0:00:08.760 ****** 28173 1726882755.59636: entering _queue_task() for managed_node2/include_tasks 28173 1726882755.59869: worker is 1 (out of 1 available) 28173 1726882755.59882: exiting _queue_task() for managed_node2/include_tasks 28173 1726882755.59897: done queuing things up, now waiting for results queue to drain 28173 1726882755.59899: waiting for pending results... 28173 1726882755.60158: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28173 1726882755.60268: in run() - task 0e448fcc-3ce9-926c-8928-0000000003e0 28173 1726882755.60288: variable 'ansible_search_path' from source: unknown 28173 1726882755.60296: variable 'ansible_search_path' from source: unknown 28173 1726882755.60345: calling self._execute() 28173 1726882755.60436: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.60451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.60468: variable 'omit' from source: magic vars 28173 1726882755.60839: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.60857: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.60876: _execute() done 28173 1726882755.60887: dumping result to json 28173 1726882755.60894: done dumping result, returning 28173 1726882755.60904: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-926c-8928-0000000003e0] 28173 1726882755.60914: sending task result for task 0e448fcc-3ce9-926c-8928-0000000003e0 28173 1726882755.61032: no more pending results, returning what we have 28173 1726882755.61037: in VariableManager get_vars() 28173 1726882755.61085: Calling all_inventory to load vars for managed_node2 28173 1726882755.61088: Calling groups_inventory to load vars for managed_node2 28173 1726882755.61091: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.61103: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.61107: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.61110: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.61311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.61526: done with get_vars() 28173 1726882755.61533: variable 'ansible_search_path' from source: unknown 
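The include chain being resolved at this point is executed but never quoted: tests_route_table.yml:21 pulls in assert_device_present.yml, whose first task (line 3) in turn includes get_interface_stat.yml. A plausible reconstruction of those two include tasks is sketched below; the relative paths are inferred from the resolved file locations, not copied from the sources. The assert at assert_device_present.yml:5 is sketched later in this log, once the stat result is back.

  # tests_route_table.yml:21 (reconstructed, not quoted from the file)
  - name: Include the task 'assert_device_present.yml'
    include_tasks: tasks/assert_device_present.yml   # relative path assumed

  # assert_device_present.yml:3 (reconstructed, not quoted from the file)
  - name: Include the task 'get_interface_stat.yml'
    include_tasks: get_interface_stat.yml            # relative path assumed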
28173 1726882755.61534: variable 'ansible_search_path' from source: unknown 28173 1726882755.61572: we have included files to process 28173 1726882755.61573: generating all_blocks data 28173 1726882755.61578: done generating all_blocks data 28173 1726882755.61581: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882755.61582: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882755.61585: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882755.61906: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000003e0 28173 1726882755.61910: WORKER PROCESS EXITING 28173 1726882755.62002: done processing included file 28173 1726882755.62004: iterating over new_blocks loaded from include file 28173 1726882755.62005: in VariableManager get_vars() 28173 1726882755.62028: done with get_vars() 28173 1726882755.62029: filtering new block on tags 28173 1726882755.62043: done filtering new block on tags 28173 1726882755.62045: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28173 1726882755.62049: extending task lists for all hosts with included blocks 28173 1726882755.62149: done extending task lists 28173 1726882755.62150: done processing included files 28173 1726882755.62151: results queue empty 28173 1726882755.62152: checking for any_errors_fatal 28173 1726882755.62155: done checking for any_errors_fatal 28173 1726882755.62156: checking for max_fail_percentage 28173 1726882755.62157: done checking for max_fail_percentage 28173 1726882755.62158: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.62159: done checking to see if all hosts have failed 28173 1726882755.62159: getting the remaining hosts for this loop 28173 1726882755.62161: done getting the remaining hosts for this loop 28173 1726882755.62165: getting the next task for host managed_node2 28173 1726882755.62170: done getting next task for host managed_node2 28173 1726882755.62172: ^ task is: TASK: Get stat for interface {{ interface }} 28173 1726882755.62174: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882755.62177: getting variables 28173 1726882755.62178: in VariableManager get_vars() 28173 1726882755.62189: Calling all_inventory to load vars for managed_node2 28173 1726882755.62191: Calling groups_inventory to load vars for managed_node2 28173 1726882755.62192: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.62196: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.62198: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.62201: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.62373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882755.62584: done with get_vars() 28173 1726882755.62593: done getting variables 28173 1726882755.62741: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:15 -0400 (0:00:00.031) 0:00:08.792 ****** 28173 1726882755.62774: entering _queue_task() for managed_node2/stat 28173 1726882755.62975: worker is 1 (out of 1 available) 28173 1726882755.62990: exiting _queue_task() for managed_node2/stat 28173 1726882755.63000: done queuing things up, now waiting for results queue to drain 28173 1726882755.63002: waiting for pending results... 28173 1726882755.63246: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 28173 1726882755.63352: in run() - task 0e448fcc-3ce9-926c-8928-0000000004ff 28173 1726882755.63374: variable 'ansible_search_path' from source: unknown 28173 1726882755.63383: variable 'ansible_search_path' from source: unknown 28173 1726882755.63421: calling self._execute() 28173 1726882755.63512: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.63523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.63542: variable 'omit' from source: magic vars 28173 1726882755.63892: variable 'ansible_distribution_major_version' from source: facts 28173 1726882755.63909: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882755.63920: variable 'omit' from source: magic vars 28173 1726882755.63971: variable 'omit' from source: magic vars 28173 1726882755.64070: variable 'interface' from source: set_fact 28173 1726882755.64099: variable 'omit' from source: magic vars 28173 1726882755.64141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882755.64179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882755.64209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882755.64232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882755.64248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882755.64283: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882755.64293: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.64306: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28173 1726882755.64409: Set connection var ansible_pipelining to False 28173 1726882755.64421: Set connection var ansible_shell_type to sh 28173 1726882755.64435: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882755.64447: Set connection var ansible_timeout to 10 28173 1726882755.64457: Set connection var ansible_shell_executable to /bin/sh 28173 1726882755.64469: Set connection var ansible_connection to ssh 28173 1726882755.64494: variable 'ansible_shell_executable' from source: unknown 28173 1726882755.64503: variable 'ansible_connection' from source: unknown 28173 1726882755.64512: variable 'ansible_module_compression' from source: unknown 28173 1726882755.64523: variable 'ansible_shell_type' from source: unknown 28173 1726882755.64533: variable 'ansible_shell_executable' from source: unknown 28173 1726882755.64541: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882755.64549: variable 'ansible_pipelining' from source: unknown 28173 1726882755.64555: variable 'ansible_timeout' from source: unknown 28173 1726882755.64565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882755.64777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882755.64793: variable 'omit' from source: magic vars 28173 1726882755.64803: starting attempt loop 28173 1726882755.64809: running the handler 28173 1726882755.64825: _low_level_execute_command(): starting 28173 1726882755.64837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882755.65612: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.65629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.65645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.65667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.65712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.65732: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.65748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.65769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.65782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.65795: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.65807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.65822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.65846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.65860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.65876: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.65892: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.65977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.66001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.66020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.66171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.67820: stdout chunk (state=3): >>>/root <<< 28173 1726882755.67938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.68018: stderr chunk (state=3): >>><<< 28173 1726882755.68041: stdout chunk (state=3): >>><<< 28173 1726882755.68154: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.68158: _low_level_execute_command(): starting 28173 1726882755.68163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352 `" && echo ansible-tmp-1726882755.6807742-28641-227587783550352="` echo /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352 `" ) && sleep 0' 28173 1726882755.68761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.68778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.68792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.68809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.68884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.68896: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.69252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.69256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.69258: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.69321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.69325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.69438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.71302: stdout chunk (state=3): >>>ansible-tmp-1726882755.6807742-28641-227587783550352=/root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352 <<< 28173 1726882755.71418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.71491: stderr chunk (state=3): >>><<< 28173 1726882755.71495: stdout chunk (state=3): >>><<< 28173 1726882755.71578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882755.6807742-28641-227587783550352=/root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.71582: variable 'ansible_module_compression' from source: unknown 28173 1726882755.71791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28173 1726882755.71794: variable 'ansible_facts' from source: unknown 28173 1726882755.71796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/AnsiballZ_stat.py 28173 1726882755.71980: Sending initial data 28173 1726882755.71983: Sent initial data (153 bytes) 28173 1726882755.72999: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.73013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.73029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.73051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.73096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.73115: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.73130: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.73149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.73161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.73180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.73194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.73209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.73231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.73245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.73257: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.73273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.73354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.73379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.73397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.73527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.75285: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882755.75382: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882755.75582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmptcwdsfq0 /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/AnsiballZ_stat.py <<< 28173 1726882755.75640: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882755.77650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.77881: stderr chunk (state=3): >>><<< 28173 1726882755.77884: stdout chunk (state=3): >>><<< 28173 1726882755.77887: done transferring module to remote 28173 1726882755.77889: _low_level_execute_command(): starting 28173 1726882755.77892: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/ /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/AnsiballZ_stat.py && sleep 0' 28173 1726882755.78520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.78533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.78556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 
1726882755.78581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.78621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.78635: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.78649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.78675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.78690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.78702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.78715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.78729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.78745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.78758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.78774: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.78794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.78871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.78894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.78913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.79034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.80803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.80847: stderr chunk (state=3): >>><<< 28173 1726882755.80860: stdout chunk (state=3): >>><<< 28173 1726882755.80874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.80877: _low_level_execute_command(): starting 28173 1726882755.80882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/AnsiballZ_stat.py && 
sleep 0' 28173 1726882755.81502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.81506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.81533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882755.81537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.81539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.81617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.81623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.81730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.94803: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31739, "dev": 21, "nlink": 1, "atime": 1726882754.2633812, "mtime": 1726882754.2633812, "ctime": 1726882754.2633812, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28173 1726882755.95789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882755.95850: stderr chunk (state=3): >>><<< 28173 1726882755.95854: stdout chunk (state=3): >>><<< 28173 1726882755.96010: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31739, "dev": 21, "nlink": 1, "atime": 1726882754.2633812, "mtime": 1726882754.2633812, "ctime": 1726882754.2633812, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
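The stat invocation that just returned lists its module arguments explicitly (path /sys/class/net/ethtest0 with get_attributes, get_checksum and get_mime all disabled), so the included get_interface_stat.yml task is probably close to the sketch below. The templated path and the register keyword are inferred; the variable name interface_stat is taken from the assert that follows.

  # Plausible content of tasks/get_interface_stat.yml (reconstructed from the
  # module arguments and the registered variable name seen in this log).
  - name: Get stat for interface {{ interface }}
    stat:
      path: /sys/class/net/{{ interface }}   # resolves to /sys/class/net/ethtest0 in this run
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: interface_stat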
28173 1726882755.96017: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882755.96020: _low_level_execute_command(): starting 28173 1726882755.96022: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882755.6807742-28641-227587783550352/ > /dev/null 2>&1 && sleep 0' 28173 1726882755.96605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882755.96619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.96634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.96652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.96705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.96718: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882755.96733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.96750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882755.96762: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882755.96780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882755.96798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882755.96813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882755.96829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882755.96842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882755.96853: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882755.96868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882755.96951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882755.96976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882755.96998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882755.97138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882755.98930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882755.99007: stderr chunk (state=3): >>><<< 28173 1726882755.99017: stdout chunk (state=3): >>><<< 28173 1726882755.99275: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882755.99278: handler run complete 28173 1726882755.99280: attempt loop complete, returning result 28173 1726882755.99282: _execute() done 28173 1726882755.99284: dumping result to json 28173 1726882755.99286: done dumping result, returning 28173 1726882755.99288: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-926c-8928-0000000004ff] 28173 1726882755.99290: sending task result for task 0e448fcc-3ce9-926c-8928-0000000004ff 28173 1726882755.99369: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000004ff 28173 1726882755.99372: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882754.2633812, "block_size": 4096, "blocks": 0, "ctime": 1726882754.2633812, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31739, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882754.2633812, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28173 1726882755.99475: no more pending results, returning what we have 28173 1726882755.99479: results queue empty 28173 1726882755.99480: checking for any_errors_fatal 28173 1726882755.99481: done checking for any_errors_fatal 28173 1726882755.99482: checking for max_fail_percentage 28173 1726882755.99485: done checking for max_fail_percentage 28173 1726882755.99486: checking to see if all hosts have failed and the running result is not ok 28173 1726882755.99487: done checking to see if all hosts have failed 28173 1726882755.99488: getting the remaining hosts for this loop 28173 1726882755.99490: done getting the remaining hosts for this loop 28173 1726882755.99493: getting the next task for host managed_node2 28173 1726882755.99501: done getting next task for host managed_node2 28173 1726882755.99503: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28173 1726882755.99507: ^ state is: HOST STATE: block=2, task=7, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882755.99512: getting variables 28173 1726882755.99513: in VariableManager get_vars() 28173 1726882755.99556: Calling all_inventory to load vars for managed_node2 28173 1726882755.99563: Calling groups_inventory to load vars for managed_node2 28173 1726882755.99568: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882755.99581: Calling all_plugins_play to load vars for managed_node2 28173 1726882755.99583: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882755.99586: Calling groups_plugins_play to load vars for managed_node2 28173 1726882755.99950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.00188: done with get_vars() 28173 1726882756.00199: done getting variables 28173 1726882756.00302: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 28173 1726882756.00432: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:39:16 -0400 (0:00:00.376) 0:00:09.169 ****** 28173 1726882756.00460: entering _queue_task() for managed_node2/assert 28173 1726882756.00462: Creating lock for assert 28173 1726882756.00724: worker is 1 (out of 1 available) 28173 1726882756.00736: exiting _queue_task() for managed_node2/assert 28173 1726882756.00753: done queuing things up, now waiting for results queue to drain 28173 1726882756.00754: waiting for pending results... 
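The assert queued here (assert_device_present.yml:5) checks the stat result gathered above; from the task name and the condition evaluated in the records below, it is approximately the following, with any custom failure message not visible in this log.

  # Plausible content of assert_device_present.yml:5 (reconstructed, not quoted from the file)
  - name: Assert that the interface is present - '{{ interface }}'
    assert:
      that:
        - interface_stat.stat.exists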
28173 1726882756.01008: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' 28173 1726882756.01118: in run() - task 0e448fcc-3ce9-926c-8928-0000000003e1 28173 1726882756.01137: variable 'ansible_search_path' from source: unknown 28173 1726882756.01144: variable 'ansible_search_path' from source: unknown 28173 1726882756.01193: calling self._execute() 28173 1726882756.01387: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.01397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.01420: variable 'omit' from source: magic vars 28173 1726882756.01771: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.01788: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.01800: variable 'omit' from source: magic vars 28173 1726882756.01840: variable 'omit' from source: magic vars 28173 1726882756.01941: variable 'interface' from source: set_fact 28173 1726882756.01973: variable 'omit' from source: magic vars 28173 1726882756.02013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882756.02051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882756.02086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882756.02109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882756.02126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882756.02159: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882756.02177: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.02188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.02295: Set connection var ansible_pipelining to False 28173 1726882756.02302: Set connection var ansible_shell_type to sh 28173 1726882756.02314: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882756.02325: Set connection var ansible_timeout to 10 28173 1726882756.02334: Set connection var ansible_shell_executable to /bin/sh 28173 1726882756.02341: Set connection var ansible_connection to ssh 28173 1726882756.02366: variable 'ansible_shell_executable' from source: unknown 28173 1726882756.02376: variable 'ansible_connection' from source: unknown 28173 1726882756.02385: variable 'ansible_module_compression' from source: unknown 28173 1726882756.02401: variable 'ansible_shell_type' from source: unknown 28173 1726882756.02409: variable 'ansible_shell_executable' from source: unknown 28173 1726882756.02417: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.02425: variable 'ansible_pipelining' from source: unknown 28173 1726882756.02432: variable 'ansible_timeout' from source: unknown 28173 1726882756.02440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.02588: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 28173 1726882756.02605: variable 'omit' from source: magic vars 28173 1726882756.02623: starting attempt loop 28173 1726882756.02629: running the handler 28173 1726882756.02770: variable 'interface_stat' from source: set_fact 28173 1726882756.02793: Evaluated conditional (interface_stat.stat.exists): True 28173 1726882756.02803: handler run complete 28173 1726882756.02820: attempt loop complete, returning result 28173 1726882756.02835: _execute() done 28173 1726882756.02843: dumping result to json 28173 1726882756.02849: done dumping result, returning 28173 1726882756.02860: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' [0e448fcc-3ce9-926c-8928-0000000003e1] 28173 1726882756.02873: sending task result for task 0e448fcc-3ce9-926c-8928-0000000003e1 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882756.03086: no more pending results, returning what we have 28173 1726882756.03089: results queue empty 28173 1726882756.03090: checking for any_errors_fatal 28173 1726882756.03096: done checking for any_errors_fatal 28173 1726882756.03097: checking for max_fail_percentage 28173 1726882756.03099: done checking for max_fail_percentage 28173 1726882756.03099: checking to see if all hosts have failed and the running result is not ok 28173 1726882756.03100: done checking to see if all hosts have failed 28173 1726882756.03101: getting the remaining hosts for this loop 28173 1726882756.03102: done getting the remaining hosts for this loop 28173 1726882756.03106: getting the next task for host managed_node2 28173 1726882756.03114: done getting next task for host managed_node2 28173 1726882756.03119: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882756.03122: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882756.03138: getting variables 28173 1726882756.03140: in VariableManager get_vars() 28173 1726882756.03174: Calling all_inventory to load vars for managed_node2 28173 1726882756.03176: Calling groups_inventory to load vars for managed_node2 28173 1726882756.03179: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.03189: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.03192: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.03195: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.03374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.03617: done with get_vars() 28173 1726882756.03627: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:16 -0400 (0:00:00.034) 0:00:09.203 ****** 28173 1726882756.03884: entering _queue_task() for managed_node2/include_tasks 28173 1726882756.03897: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000003e1 28173 1726882756.03900: WORKER PROCESS EXITING 28173 1726882756.04161: worker is 1 (out of 1 available) 28173 1726882756.04173: exiting _queue_task() for managed_node2/include_tasks 28173 1726882756.04184: done queuing things up, now waiting for results queue to drain 28173 1726882756.04185: waiting for pending results... 28173 1726882756.04440: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882756.04578: in run() - task 0e448fcc-3ce9-926c-8928-000000000016 28173 1726882756.04598: variable 'ansible_search_path' from source: unknown 28173 1726882756.04605: variable 'ansible_search_path' from source: unknown 28173 1726882756.04646: calling self._execute() 28173 1726882756.04732: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.04746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.04758: variable 'omit' from source: magic vars 28173 1726882756.05112: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.05135: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.05146: _execute() done 28173 1726882756.05154: dumping result to json 28173 1726882756.05161: done dumping result, returning 28173 1726882756.05176: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-926c-8928-000000000016] 28173 1726882756.05187: sending task result for task 0e448fcc-3ce9-926c-8928-000000000016 28173 1726882756.05318: no more pending results, returning what we have 28173 1726882756.05323: in VariableManager get_vars() 28173 1726882756.05368: Calling all_inventory to load vars for managed_node2 28173 1726882756.05371: Calling groups_inventory to load vars for managed_node2 28173 1726882756.05373: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.05386: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.05389: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.05392: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.05588: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.05850: done with get_vars() 28173 1726882756.05858: variable 'ansible_search_path' from source: unknown 28173 1726882756.05859: variable 'ansible_search_path' from source: unknown 28173 1726882756.05908: we have included files to process 28173 1726882756.05909: generating all_blocks data 28173 1726882756.05911: done generating all_blocks data 28173 1726882756.05916: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882756.05917: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882756.05919: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882756.06316: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000016 28173 1726882756.06321: WORKER PROCESS EXITING 28173 1726882756.06828: done processing included file 28173 1726882756.06830: iterating over new_blocks loaded from include file 28173 1726882756.06831: in VariableManager get_vars() 28173 1726882756.06853: done with get_vars() 28173 1726882756.06855: filtering new block on tags 28173 1726882756.06878: done filtering new block on tags 28173 1726882756.06880: in VariableManager get_vars() 28173 1726882756.06902: done with get_vars() 28173 1726882756.06903: filtering new block on tags 28173 1726882756.06924: done filtering new block on tags 28173 1726882756.06926: in VariableManager get_vars() 28173 1726882756.06949: done with get_vars() 28173 1726882756.06950: filtering new block on tags 28173 1726882756.06970: done filtering new block on tags 28173 1726882756.06972: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28173 1726882756.06983: extending task lists for all hosts with included blocks 28173 1726882756.07880: done extending task lists 28173 1726882756.07881: done processing included files 28173 1726882756.07882: results queue empty 28173 1726882756.07883: checking for any_errors_fatal 28173 1726882756.07885: done checking for any_errors_fatal 28173 1726882756.07886: checking for max_fail_percentage 28173 1726882756.07887: done checking for max_fail_percentage 28173 1726882756.07887: checking to see if all hosts have failed and the running result is not ok 28173 1726882756.07888: done checking to see if all hosts have failed 28173 1726882756.07889: getting the remaining hosts for this loop 28173 1726882756.07890: done getting the remaining hosts for this loop 28173 1726882756.07892: getting the next task for host managed_node2 28173 1726882756.07896: done getting next task for host managed_node2 28173 1726882756.07899: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882756.07902: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882756.07911: getting variables 28173 1726882756.07912: in VariableManager get_vars() 28173 1726882756.07926: Calling all_inventory to load vars for managed_node2 28173 1726882756.07928: Calling groups_inventory to load vars for managed_node2 28173 1726882756.07930: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.07934: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.07937: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.07940: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.08124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.08351: done with get_vars() 28173 1726882756.08360: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:16 -0400 (0:00:00.045) 0:00:09.248 ****** 28173 1726882756.08433: entering _queue_task() for managed_node2/setup 28173 1726882756.08644: worker is 1 (out of 1 available) 28173 1726882756.08655: exiting _queue_task() for managed_node2/setup 28173 1726882756.08667: done queuing things up, now waiting for results queue to drain 28173 1726882756.08668: waiting for pending results... 
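For context, the tasks from the included set_facts.yml that run next (task paths .../roles/network/tasks/set_facts.yml:3, :12, :17 and :21 in this log) most likely resemble the sketch below. It is reconstructed only from the task names, the action plugins queued for them (setup, stat, set_fact, service_facts) and the conditionals and no_log behaviour recorded in this log; module arguments such as the gather subset, the stat path and the register name are illustrative assumptions, not the real file.

# Hedged sketch of set_facts.yml, reconstructed from this log.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min        # assumption; the real subset is not visible in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                # matches the "censored" skip result shown below

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumption; the path is not shown in the log
  register: __ostree_booted_stat   # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumption
  when: not __network_is_ostree is defined

- name: Check which services are running
  service_facts:

In this run __network_is_ostree is already set via set_fact, so the log below shows the :12 and :17 tasks skipping on "not __network_is_ostree is defined", while the service_facts task at :21 proceeds to module transfer and execution.
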
28173 1726882756.08918: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882756.09084: in run() - task 0e448fcc-3ce9-926c-8928-000000000517 28173 1726882756.09105: variable 'ansible_search_path' from source: unknown 28173 1726882756.09115: variable 'ansible_search_path' from source: unknown 28173 1726882756.09152: calling self._execute() 28173 1726882756.09237: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.09247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.09260: variable 'omit' from source: magic vars 28173 1726882756.09614: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.09630: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.09858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882756.12297: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882756.12375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882756.12416: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882756.12462: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882756.12497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882756.12587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882756.12621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882756.12662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882756.12717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882756.12736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882756.12804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882756.12833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882756.12872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882756.12921: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882756.12940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882756.13113: variable '__network_required_facts' from source: role '' defaults 28173 1726882756.13131: variable 'ansible_facts' from source: unknown 28173 1726882756.13240: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28173 1726882756.13248: when evaluation is False, skipping this task 28173 1726882756.13255: _execute() done 28173 1726882756.13262: dumping result to json 28173 1726882756.13273: done dumping result, returning 28173 1726882756.13284: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-926c-8928-000000000517] 28173 1726882756.13299: sending task result for task 0e448fcc-3ce9-926c-8928-000000000517 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882756.13439: no more pending results, returning what we have 28173 1726882756.13443: results queue empty 28173 1726882756.13444: checking for any_errors_fatal 28173 1726882756.13446: done checking for any_errors_fatal 28173 1726882756.13446: checking for max_fail_percentage 28173 1726882756.13448: done checking for max_fail_percentage 28173 1726882756.13449: checking to see if all hosts have failed and the running result is not ok 28173 1726882756.13450: done checking to see if all hosts have failed 28173 1726882756.13451: getting the remaining hosts for this loop 28173 1726882756.13452: done getting the remaining hosts for this loop 28173 1726882756.13456: getting the next task for host managed_node2 28173 1726882756.13466: done getting next task for host managed_node2 28173 1726882756.13471: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882756.13476: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882756.13489: getting variables 28173 1726882756.13491: in VariableManager get_vars() 28173 1726882756.13531: Calling all_inventory to load vars for managed_node2 28173 1726882756.13533: Calling groups_inventory to load vars for managed_node2 28173 1726882756.13536: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.13545: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.13548: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.13551: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.13732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.14009: done with get_vars() 28173 1726882756.14020: done getting variables 28173 1726882756.14224: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000517 28173 1726882756.14227: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:16 -0400 (0:00:00.058) 0:00:09.306 ****** 28173 1726882756.14249: entering _queue_task() for managed_node2/stat 28173 1726882756.14616: worker is 1 (out of 1 available) 28173 1726882756.14626: exiting _queue_task() for managed_node2/stat 28173 1726882756.14646: done queuing things up, now waiting for results queue to drain 28173 1726882756.14648: waiting for pending results... 28173 1726882756.14911: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882756.15053: in run() - task 0e448fcc-3ce9-926c-8928-000000000519 28173 1726882756.15073: variable 'ansible_search_path' from source: unknown 28173 1726882756.15088: variable 'ansible_search_path' from source: unknown 28173 1726882756.15125: calling self._execute() 28173 1726882756.15209: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.15219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.15230: variable 'omit' from source: magic vars 28173 1726882756.15576: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.15593: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.15768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882756.16029: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882756.16083: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882756.16118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882756.16154: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882756.16239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882756.16269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882756.16310: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882756.16340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882756.16435: variable '__network_is_ostree' from source: set_fact 28173 1726882756.16445: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882756.16451: when evaluation is False, skipping this task 28173 1726882756.16456: _execute() done 28173 1726882756.16461: dumping result to json 28173 1726882756.16471: done dumping result, returning 28173 1726882756.16480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-926c-8928-000000000519] 28173 1726882756.16489: sending task result for task 0e448fcc-3ce9-926c-8928-000000000519 28173 1726882756.16589: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000519 28173 1726882756.16596: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882756.16657: no more pending results, returning what we have 28173 1726882756.16661: results queue empty 28173 1726882756.16662: checking for any_errors_fatal 28173 1726882756.16672: done checking for any_errors_fatal 28173 1726882756.16673: checking for max_fail_percentage 28173 1726882756.16675: done checking for max_fail_percentage 28173 1726882756.16676: checking to see if all hosts have failed and the running result is not ok 28173 1726882756.16677: done checking to see if all hosts have failed 28173 1726882756.16677: getting the remaining hosts for this loop 28173 1726882756.16679: done getting the remaining hosts for this loop 28173 1726882756.16682: getting the next task for host managed_node2 28173 1726882756.16688: done getting next task for host managed_node2 28173 1726882756.16691: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882756.16696: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882756.16711: getting variables 28173 1726882756.16712: in VariableManager get_vars() 28173 1726882756.16749: Calling all_inventory to load vars for managed_node2 28173 1726882756.16752: Calling groups_inventory to load vars for managed_node2 28173 1726882756.16754: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.16766: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.16770: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.16773: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.16955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.17203: done with get_vars() 28173 1726882756.17213: done getting variables 28173 1726882756.17270: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:16 -0400 (0:00:00.031) 0:00:09.338 ****** 28173 1726882756.17425: entering _queue_task() for managed_node2/set_fact 28173 1726882756.17701: worker is 1 (out of 1 available) 28173 1726882756.17711: exiting _queue_task() for managed_node2/set_fact 28173 1726882756.17737: done queuing things up, now waiting for results queue to drain 28173 1726882756.17738: waiting for pending results... 28173 1726882756.18013: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882756.18159: in run() - task 0e448fcc-3ce9-926c-8928-00000000051a 28173 1726882756.18190: variable 'ansible_search_path' from source: unknown 28173 1726882756.18198: variable 'ansible_search_path' from source: unknown 28173 1726882756.18233: calling self._execute() 28173 1726882756.18332: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.18344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.18356: variable 'omit' from source: magic vars 28173 1726882756.18711: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.18738: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.18907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882756.19238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882756.19310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882756.19352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882756.19454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882756.19561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882756.19610: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882756.19629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882756.19647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882756.19732: variable '__network_is_ostree' from source: set_fact 28173 1726882756.19738: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882756.19741: when evaluation is False, skipping this task 28173 1726882756.19743: _execute() done 28173 1726882756.19745: dumping result to json 28173 1726882756.19748: done dumping result, returning 28173 1726882756.19754: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-926c-8928-00000000051a] 28173 1726882756.19759: sending task result for task 0e448fcc-3ce9-926c-8928-00000000051a 28173 1726882756.19837: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000051a 28173 1726882756.19840: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882756.19895: no more pending results, returning what we have 28173 1726882756.19898: results queue empty 28173 1726882756.19899: checking for any_errors_fatal 28173 1726882756.19903: done checking for any_errors_fatal 28173 1726882756.19903: checking for max_fail_percentage 28173 1726882756.19905: done checking for max_fail_percentage 28173 1726882756.19905: checking to see if all hosts have failed and the running result is not ok 28173 1726882756.19906: done checking to see if all hosts have failed 28173 1726882756.19907: getting the remaining hosts for this loop 28173 1726882756.19908: done getting the remaining hosts for this loop 28173 1726882756.19912: getting the next task for host managed_node2 28173 1726882756.19918: done getting next task for host managed_node2 28173 1726882756.19922: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882756.19926: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882756.19937: getting variables 28173 1726882756.19938: in VariableManager get_vars() 28173 1726882756.19974: Calling all_inventory to load vars for managed_node2 28173 1726882756.19976: Calling groups_inventory to load vars for managed_node2 28173 1726882756.19978: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882756.19984: Calling all_plugins_play to load vars for managed_node2 28173 1726882756.19986: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882756.19987: Calling groups_plugins_play to load vars for managed_node2 28173 1726882756.20123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882756.20251: done with get_vars() 28173 1726882756.20258: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:16 -0400 (0:00:00.028) 0:00:09.367 ****** 28173 1726882756.20325: entering _queue_task() for managed_node2/service_facts 28173 1726882756.20326: Creating lock for service_facts 28173 1726882756.20489: worker is 1 (out of 1 available) 28173 1726882756.20499: exiting _queue_task() for managed_node2/service_facts 28173 1726882756.20511: done queuing things up, now waiting for results queue to drain 28173 1726882756.20512: waiting for pending results... 28173 1726882756.20662: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882756.20746: in run() - task 0e448fcc-3ce9-926c-8928-00000000051c 28173 1726882756.20757: variable 'ansible_search_path' from source: unknown 28173 1726882756.20760: variable 'ansible_search_path' from source: unknown 28173 1726882756.20787: calling self._execute() 28173 1726882756.20846: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.20849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.20852: variable 'omit' from source: magic vars 28173 1726882756.21086: variable 'ansible_distribution_major_version' from source: facts 28173 1726882756.21096: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882756.21102: variable 'omit' from source: magic vars 28173 1726882756.21143: variable 'omit' from source: magic vars 28173 1726882756.21169: variable 'omit' from source: magic vars 28173 1726882756.21198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882756.21220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882756.21235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882756.21248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882756.21257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882756.21286: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882756.21289: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.21291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.21350: Set 
connection var ansible_pipelining to False 28173 1726882756.21354: Set connection var ansible_shell_type to sh 28173 1726882756.21359: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882756.21371: Set connection var ansible_timeout to 10 28173 1726882756.21374: Set connection var ansible_shell_executable to /bin/sh 28173 1726882756.21376: Set connection var ansible_connection to ssh 28173 1726882756.21394: variable 'ansible_shell_executable' from source: unknown 28173 1726882756.21397: variable 'ansible_connection' from source: unknown 28173 1726882756.21402: variable 'ansible_module_compression' from source: unknown 28173 1726882756.21404: variable 'ansible_shell_type' from source: unknown 28173 1726882756.21406: variable 'ansible_shell_executable' from source: unknown 28173 1726882756.21408: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882756.21411: variable 'ansible_pipelining' from source: unknown 28173 1726882756.21413: variable 'ansible_timeout' from source: unknown 28173 1726882756.21414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882756.21569: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882756.21575: variable 'omit' from source: magic vars 28173 1726882756.21580: starting attempt loop 28173 1726882756.21582: running the handler 28173 1726882756.21592: _low_level_execute_command(): starting 28173 1726882756.21641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882756.22359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882756.22385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882756.22516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882756.24180: stdout chunk (state=3): >>>/root <<< 28173 1726882756.24317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882756.24360: stderr chunk (state=3): >>><<< 28173 1726882756.24374: stdout chunk (state=3): >>><<< 28173 1726882756.24393: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882756.24418: _low_level_execute_command(): starting 28173 1726882756.24430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064 `" && echo ansible-tmp-1726882756.243984-28667-155655227047064="` echo /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064 `" ) && sleep 0' 28173 1726882756.25050: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.25054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882756.25098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.25102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882756.25110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.25147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882756.25150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882756.25267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882756.27154: stdout chunk (state=3): >>>ansible-tmp-1726882756.243984-28667-155655227047064=/root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064 <<< 28173 1726882756.27263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882756.27321: stderr chunk (state=3): >>><<< 28173 1726882756.27325: stdout chunk (state=3): >>><<< 28173 1726882756.27337: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882756.243984-28667-155655227047064=/root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882756.27373: variable 'ansible_module_compression' from source: unknown 28173 1726882756.27407: ANSIBALLZ: Using lock for service_facts 28173 1726882756.27410: ANSIBALLZ: Acquiring lock 28173 1726882756.27413: ANSIBALLZ: Lock acquired: 140243972637088 28173 1726882756.27415: ANSIBALLZ: Creating module 28173 1726882756.35819: ANSIBALLZ: Writing module into payload 28173 1726882756.35897: ANSIBALLZ: Writing module 28173 1726882756.35917: ANSIBALLZ: Renaming module 28173 1726882756.35920: ANSIBALLZ: Done creating module 28173 1726882756.35934: variable 'ansible_facts' from source: unknown 28173 1726882756.35985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/AnsiballZ_service_facts.py 28173 1726882756.36092: Sending initial data 28173 1726882756.36095: Sent initial data (161 bytes) 28173 1726882756.36779: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882756.36782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882756.36820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.36823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.36826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.36871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882756.36890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882756.36892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882756.36992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 
1726882756.38814: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882756.38909: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882756.39007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpeskv08xg /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/AnsiballZ_service_facts.py <<< 28173 1726882756.39107: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882756.40531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882756.40656: stderr chunk (state=3): >>><<< 28173 1726882756.40660: stdout chunk (state=3): >>><<< 28173 1726882756.40680: done transferring module to remote 28173 1726882756.40690: _low_level_execute_command(): starting 28173 1726882756.40695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/ /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/AnsiballZ_service_facts.py && sleep 0' 28173 1726882756.41161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882756.41174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882756.41182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.41194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882756.41225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882756.41231: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882756.41240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.41249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882756.41255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.41267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882756.41273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882756.41277: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882756.41288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.41335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882756.41352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 
1726882756.41357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882756.41475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882756.43245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882756.43287: stderr chunk (state=3): >>><<< 28173 1726882756.43290: stdout chunk (state=3): >>><<< 28173 1726882756.43301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882756.43303: _low_level_execute_command(): starting 28173 1726882756.43308: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/AnsiballZ_service_facts.py && sleep 0' 28173 1726882756.43706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.43711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882756.43757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.43760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882756.43763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882756.43767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882756.43821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882756.43824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882756.43931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882757.79039: stdout chunk (state=3): >>> 
{"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 28173 1726882757.79069: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28173 1726882757.80289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882757.80363: stderr chunk (state=3): >>><<< 28173 1726882757.80368: stdout chunk (state=3): >>><<< 28173 1726882757.80395: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": 
"ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": 
"systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": 
{"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882757.81019: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882757.81028: _low_level_execute_command(): starting 28173 1726882757.81033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882756.243984-28667-155655227047064/ > /dev/null 2>&1 && sleep 0' 28173 1726882757.81685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882757.81695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882757.81705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.81720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882757.81758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882757.81766: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882757.81779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.81792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882757.81799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882757.81805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882757.81813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882757.81825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.81833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882757.81838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882757.81845: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882757.81854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.81947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882757.81955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882757.81958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882757.82087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882757.83924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882757.83996: stderr chunk (state=3): >>><<< 28173 1726882757.84007: stdout chunk (state=3): >>><<< 28173 1726882757.84275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882757.84278: handler run complete 28173 1726882757.84280: variable 'ansible_facts' from source: unknown 28173 1726882757.84367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882757.84857: variable 'ansible_facts' from source: unknown 28173 1726882757.84992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882757.85201: attempt loop complete, returning result 28173 1726882757.85211: _execute() done 28173 1726882757.85217: dumping result to json 28173 1726882757.85288: done dumping result, returning 28173 1726882757.85301: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-926c-8928-00000000051c] 28173 1726882757.85311: sending task result for task 0e448fcc-3ce9-926c-8928-00000000051c ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882757.86329: no more pending results, returning what we have 28173 1726882757.86331: results queue empty 28173 1726882757.86332: checking for any_errors_fatal 28173 1726882757.86335: done checking for any_errors_fatal 28173 1726882757.86336: checking for max_fail_percentage 28173 1726882757.86337: done checking for max_fail_percentage 28173 1726882757.86338: checking to see if all hosts have failed and the running result is not ok 28173 1726882757.86339: done checking to see if all hosts have failed 28173 1726882757.86340: getting the remaining hosts for this loop 28173 1726882757.86341: done getting the remaining hosts for this loop 28173 1726882757.86345: getting the next task for host managed_node2 28173 1726882757.86351: done getting next task for host managed_node2 28173 1726882757.86354: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882757.86359: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882757.86369: getting variables 28173 1726882757.86371: in VariableManager get_vars() 28173 1726882757.86406: Calling all_inventory to load vars for managed_node2 28173 1726882757.86408: Calling groups_inventory to load vars for managed_node2 28173 1726882757.86411: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882757.86422: Calling all_plugins_play to load vars for managed_node2 28173 1726882757.86424: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882757.86427: Calling groups_plugins_play to load vars for managed_node2 28173 1726882757.86802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882757.87575: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000051c 28173 1726882757.87578: WORKER PROCESS EXITING 28173 1726882757.87766: done with get_vars() 28173 1726882757.87785: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:17 -0400 (0:00:01.675) 0:00:11.043 ****** 28173 1726882757.87880: entering _queue_task() for managed_node2/package_facts 28173 1726882757.87882: Creating lock for package_facts 28173 1726882757.88152: worker is 1 (out of 1 available) 28173 1726882757.88163: exiting _queue_task() for managed_node2/package_facts 28173 1726882757.88176: done queuing things up, now waiting for results queue to drain 28173 1726882757.88178: waiting for pending results... 
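Editor's note: the task result above is censored because no_log: true was in effect for the service_facts task (visible as '_ansible_no_log': True in the module args earlier), and the next task queued is package_facts, gated on ansible_distribution_major_version != '6'. The sketch below is only an approximation of such fact-gathering tasks, reusing the two task names and the conditional that appear in this log; it is not the role's actual set_facts.yml, and the play wrapper and host pattern are assumptions added so the snippet can run on its own.

# Minimal sketch, assuming a standalone play; NOT the role's real set_facts.yml.
# no_log: true is what produces the "censored" result seen in the log above,
# and the when: condition matches the conditional evaluated for package_facts.
- name: Gather service and package facts
  hosts: managed_node2        # host pattern taken from this log for illustration
  gather_facts: true          # provides ansible_distribution_major_version
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true

    - name: Check which packages are installed
      ansible.builtin.package_facts:
      when: ansible_distribution_major_version != '6'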
28173 1726882757.88457: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882757.88616: in run() - task 0e448fcc-3ce9-926c-8928-00000000051d 28173 1726882757.88637: variable 'ansible_search_path' from source: unknown 28173 1726882757.88645: variable 'ansible_search_path' from source: unknown 28173 1726882757.88690: calling self._execute() 28173 1726882757.88779: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882757.88789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882757.88801: variable 'omit' from source: magic vars 28173 1726882757.89188: variable 'ansible_distribution_major_version' from source: facts 28173 1726882757.89215: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882757.89226: variable 'omit' from source: magic vars 28173 1726882757.89307: variable 'omit' from source: magic vars 28173 1726882757.89349: variable 'omit' from source: magic vars 28173 1726882757.89397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882757.89441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882757.89463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882757.89491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882757.89508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882757.89551: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882757.89560: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882757.89571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882757.89692: Set connection var ansible_pipelining to False 28173 1726882757.89703: Set connection var ansible_shell_type to sh 28173 1726882757.89720: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882757.89733: Set connection var ansible_timeout to 10 28173 1726882757.89753: Set connection var ansible_shell_executable to /bin/sh 28173 1726882757.89767: Set connection var ansible_connection to ssh 28173 1726882757.89794: variable 'ansible_shell_executable' from source: unknown 28173 1726882757.89803: variable 'ansible_connection' from source: unknown 28173 1726882757.89814: variable 'ansible_module_compression' from source: unknown 28173 1726882757.89824: variable 'ansible_shell_type' from source: unknown 28173 1726882757.89832: variable 'ansible_shell_executable' from source: unknown 28173 1726882757.89840: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882757.89856: variable 'ansible_pipelining' from source: unknown 28173 1726882757.89869: variable 'ansible_timeout' from source: unknown 28173 1726882757.89880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882757.90093: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882757.90107: variable 'omit' from source: magic vars 28173 
1726882757.90115: starting attempt loop 28173 1726882757.90121: running the handler 28173 1726882757.90134: _low_level_execute_command(): starting 28173 1726882757.90148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882757.90916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882757.90929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882757.90950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.90970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882757.91011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882757.91026: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882757.91039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.91067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882757.91081: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882757.91093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882757.91105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882757.91119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.91138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882757.91152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882757.91171: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882757.91187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.91266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882757.91296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882757.91311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882757.91446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882757.93126: stdout chunk (state=3): >>>/root <<< 28173 1726882757.93282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882757.93320: stderr chunk (state=3): >>><<< 28173 1726882757.93323: stdout chunk (state=3): >>><<< 28173 1726882757.93423: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882757.93427: _low_level_execute_command(): starting 28173 1726882757.93430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894 `" && echo ansible-tmp-1726882757.9333951-28714-230925880383894="` echo /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894 `" ) && sleep 0' 28173 1726882757.93995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.93998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882757.94042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.94045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882757.94048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882757.94111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882757.94115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882757.94244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882757.96190: stdout chunk (state=3): >>>ansible-tmp-1726882757.9333951-28714-230925880383894=/root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894 <<< 28173 1726882757.96305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882757.96369: stderr chunk (state=3): >>><<< 28173 1726882757.96627: stdout chunk (state=3): >>><<< 28173 1726882757.96631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882757.9333951-28714-230925880383894=/root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882757.96634: variable 'ansible_module_compression' from source: unknown 28173 1726882757.96636: ANSIBALLZ: Using lock for package_facts 28173 1726882757.96638: ANSIBALLZ: Acquiring lock 28173 1726882757.96640: ANSIBALLZ: Lock acquired: 140243972850992 28173 1726882757.96642: ANSIBALLZ: Creating module 28173 1726882758.16845: ANSIBALLZ: Writing module into payload 28173 1726882758.16957: ANSIBALLZ: Writing module 28173 1726882758.16985: ANSIBALLZ: Renaming module 28173 1726882758.16993: ANSIBALLZ: Done creating module 28173 1726882758.17024: variable 'ansible_facts' from source: unknown 28173 1726882758.17156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/AnsiballZ_package_facts.py 28173 1726882758.17277: Sending initial data 28173 1726882758.17280: Sent initial data (162 bytes) 28173 1726882758.18002: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.18006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.18042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882758.18047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.18049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882758.18052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.18097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882758.18108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882758.18228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882758.20074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 <<< 28173 1726882758.20170: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882758.20261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpvb9rfpfw /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/AnsiballZ_package_facts.py <<< 28173 1726882758.20369: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882758.22343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882758.22435: stderr chunk (state=3): >>><<< 28173 1726882758.22439: stdout chunk (state=3): >>><<< 28173 1726882758.22452: done transferring module to remote 28173 1726882758.22462: _low_level_execute_command(): starting 28173 1726882758.22471: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/ /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/AnsiballZ_package_facts.py && sleep 0' 28173 1726882758.22904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.22909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.22943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.22948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882758.22956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.22967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882758.22977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.22982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.23038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882758.23041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882758.23049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882758.23160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882758.24922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882758.24971: stderr chunk (state=3): >>><<< 28173 1726882758.24974: stdout chunk (state=3): >>><<< 28173 1726882758.24988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882758.24991: _low_level_execute_command(): starting 28173 1726882758.24995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/AnsiballZ_package_facts.py && sleep 0' 28173 1726882758.25420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.25425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.25455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.25472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.25527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882758.25535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882758.25651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882758.72097: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": 
"python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [<<< 28173 1726882758.72157: stdout chunk (state=3): >>>{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 28173 1726882758.72161: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": 
"1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 28173 1726882758.72208: stdout chunk (state=3): >>>libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 28173 1726882758.72212: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "iputils": [{"name": "iputils", "version": "20210202", <<< 28173 1726882758.72228: stdout chunk (state=3): >>>"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": 
[{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-bas<<< 28173 1726882758.72241: stdout chunk (state=3): >>>e-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", 
"version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch<<< 28173 1726882758.72246: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": 
"4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source"<<< 28173 1726882758.72249: stdout chunk (state=3): >>>: "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", 
"version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "ar<<< 28173 1726882758.72276: stdout chunk (state=3): >>>ch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": 
[{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", <<< 28173 1726882758.72280: stdout chunk (state=3): >>>"release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", 
"release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "<<< 28173 1726882758.72287: stdout chunk (state=3): >>>version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64",<<< 28173 1726882758.72290: stdout chunk (state=3): >>> "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "re<<< 28173 1726882758.72293: stdout chunk (state=3): >>>lease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": 
"2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", <<< 28173 1726882758.72296: stdout chunk (state=3): >>>"source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28173 1726882758.73833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882758.73837: stdout chunk (state=3): >>><<< 28173 1726882758.73839: stderr chunk (state=3): >>><<< 28173 1726882758.73982: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": 
[{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", 
"release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", 
"epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": 
"3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", 
"version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": 
[{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882758.77501: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882758.77531: _low_level_execute_command(): starting 28173 1726882758.77541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882757.9333951-28714-230925880383894/ > /dev/null 2>&1 && sleep 0' 28173 1726882758.78260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882758.78276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.78290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882758.78306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.78369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882758.78384: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882758.78397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.78413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882758.78424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882758.78434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882758.78445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882758.78457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882758.78487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882758.78501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882758.78514: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882758.78527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882758.78612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882758.78634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882758.78649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882758.78781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882758.80684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882758.80712: stderr chunk (state=3): >>><<< 28173 1726882758.80715: stdout chunk (state=3): >>><<< 28173 1726882758.80875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882758.80879: handler run complete 28173 1726882758.81778: variable 'ansible_facts' from source: unknown 28173 1726882758.82305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882758.88660: variable 'ansible_facts' from source: unknown 28173 1726882758.89181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882758.90059: attempt loop complete, returning result 28173 1726882758.90078: _execute() done 28173 1726882758.90085: dumping result to json 28173 1726882758.90352: done dumping result, returning 28173 1726882758.90368: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-926c-8928-00000000051d] 28173 1726882758.90379: sending task result for task 0e448fcc-3ce9-926c-8928-00000000051d 28173 1726882758.92773: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000051d 28173 1726882758.92776: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882758.92887: no more pending results, returning what we have 28173 1726882758.92890: results queue empty 28173 1726882758.92891: checking for any_errors_fatal 28173 1726882758.92895: done checking for any_errors_fatal 28173 1726882758.92896: checking for max_fail_percentage 28173 1726882758.92898: done checking for max_fail_percentage 28173 1726882758.92898: checking to see if all hosts have failed and the running result is not ok 28173 1726882758.92899: done checking to see if all hosts have failed 28173 1726882758.92900: getting the remaining hosts for this loop 28173 1726882758.92902: done getting the remaining hosts for this loop 28173 1726882758.92905: getting the next task for host managed_node2 28173 1726882758.92911: done getting next task for host managed_node2 28173 1726882758.92915: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882758.92918: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882758.92927: getting variables 28173 1726882758.92928: in VariableManager get_vars() 28173 1726882758.92973: Calling all_inventory to load vars for managed_node2 28173 1726882758.92976: Calling groups_inventory to load vars for managed_node2 28173 1726882758.92979: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882758.92988: Calling all_plugins_play to load vars for managed_node2 28173 1726882758.92991: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882758.92994: Calling groups_plugins_play to load vars for managed_node2 28173 1726882758.94610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882758.96340: done with get_vars() 28173 1726882758.96367: done getting variables 28173 1726882758.96422: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:18 -0400 (0:00:01.085) 0:00:12.129 ****** 28173 1726882758.96460: entering _queue_task() for managed_node2/debug 28173 1726882758.96755: worker is 1 (out of 1 available) 28173 1726882758.96773: exiting _queue_task() for managed_node2/debug 28173 1726882758.96785: done queuing things up, now waiting for results queue to drain 28173 1726882758.96787: waiting for pending results... 
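The TASK banner above points at tasks/main.yml:7 of the network role and loads the debug action plugin; further down, the result message reads "Using network provider: nm" and the network_provider variable is reported as coming from set_fact. A minimal sketch of such a task, assuming the message is built from that variable and guarded by the distribution-version conditional the executor evaluates; the exact contents of the role file are an assumption:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # produces the "Using network provider: nm" message logged below
  when: ansible_distribution_major_version != '6'            # the conditional the executor reports as True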
28173 1726882758.97079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882758.97216: in run() - task 0e448fcc-3ce9-926c-8928-000000000017 28173 1726882758.97239: variable 'ansible_search_path' from source: unknown 28173 1726882758.97246: variable 'ansible_search_path' from source: unknown 28173 1726882758.97289: calling self._execute() 28173 1726882758.97389: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882758.97401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882758.97414: variable 'omit' from source: magic vars 28173 1726882758.97815: variable 'ansible_distribution_major_version' from source: facts 28173 1726882758.97833: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882758.97844: variable 'omit' from source: magic vars 28173 1726882758.97909: variable 'omit' from source: magic vars 28173 1726882758.98022: variable 'network_provider' from source: set_fact 28173 1726882758.98044: variable 'omit' from source: magic vars 28173 1726882758.98099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882758.98136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882758.98161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882758.98192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882758.98211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882758.98242: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882758.98251: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882758.98260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882758.98367: Set connection var ansible_pipelining to False 28173 1726882758.98370: Set connection var ansible_shell_type to sh 28173 1726882758.98376: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882758.98383: Set connection var ansible_timeout to 10 28173 1726882758.98389: Set connection var ansible_shell_executable to /bin/sh 28173 1726882758.98392: Set connection var ansible_connection to ssh 28173 1726882758.98438: variable 'ansible_shell_executable' from source: unknown 28173 1726882758.98442: variable 'ansible_connection' from source: unknown 28173 1726882758.98445: variable 'ansible_module_compression' from source: unknown 28173 1726882758.98448: variable 'ansible_shell_type' from source: unknown 28173 1726882758.98450: variable 'ansible_shell_executable' from source: unknown 28173 1726882758.98452: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882758.98455: variable 'ansible_pipelining' from source: unknown 28173 1726882758.98457: variable 'ansible_timeout' from source: unknown 28173 1726882758.98459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882758.98560: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 28173 1726882758.98570: variable 'omit' from source: magic vars 28173 1726882758.98573: starting attempt loop 28173 1726882758.98577: running the handler 28173 1726882758.98622: handler run complete 28173 1726882758.98635: attempt loop complete, returning result 28173 1726882758.98638: _execute() done 28173 1726882758.98641: dumping result to json 28173 1726882758.98644: done dumping result, returning 28173 1726882758.98648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-926c-8928-000000000017] 28173 1726882758.98654: sending task result for task 0e448fcc-3ce9-926c-8928-000000000017 28173 1726882758.98738: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000017 28173 1726882758.98741: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28173 1726882758.98809: no more pending results, returning what we have 28173 1726882758.98812: results queue empty 28173 1726882758.98812: checking for any_errors_fatal 28173 1726882758.98821: done checking for any_errors_fatal 28173 1726882758.98822: checking for max_fail_percentage 28173 1726882758.98824: done checking for max_fail_percentage 28173 1726882758.98824: checking to see if all hosts have failed and the running result is not ok 28173 1726882758.98825: done checking to see if all hosts have failed 28173 1726882758.98826: getting the remaining hosts for this loop 28173 1726882758.98828: done getting the remaining hosts for this loop 28173 1726882758.98832: getting the next task for host managed_node2 28173 1726882758.98837: done getting next task for host managed_node2 28173 1726882758.98841: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882758.98844: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882758.98855: getting variables 28173 1726882758.98857: in VariableManager get_vars() 28173 1726882758.98896: Calling all_inventory to load vars for managed_node2 28173 1726882758.98899: Calling groups_inventory to load vars for managed_node2 28173 1726882758.98901: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882758.98909: Calling all_plugins_play to load vars for managed_node2 28173 1726882758.98912: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882758.98914: Calling groups_plugins_play to load vars for managed_node2 28173 1726882758.99712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.00991: done with get_vars() 28173 1726882759.01013: done getting variables 28173 1726882759.01076: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:19 -0400 (0:00:00.046) 0:00:12.175 ****** 28173 1726882759.01110: entering _queue_task() for managed_node2/fail 28173 1726882759.01399: worker is 1 (out of 1 available) 28173 1726882759.01410: exiting _queue_task() for managed_node2/fail 28173 1726882759.01423: done queuing things up, now waiting for results queue to drain 28173 1726882759.01425: waiting for pending results... 
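The TASK banner above comes from tasks/main.yml:11 and loads the fail action; the executor lines that follow show the task being skipped because network_state != {} evaluates to False. A hedged sketch of what such a guarded fail task could look like; the message text and the extra provider check are assumptions suggested by the task name, while only the action and the logged condition come from the output:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider   # assumed wording
  when:
    - network_state != {}                   # the condition logged (and found False) just below
    - network_provider == "initscripts"     # assumption, implied by the task name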
28173 1726882759.01785: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882759.01908: in run() - task 0e448fcc-3ce9-926c-8928-000000000018 28173 1726882759.01912: variable 'ansible_search_path' from source: unknown 28173 1726882759.01915: variable 'ansible_search_path' from source: unknown 28173 1726882759.01918: calling self._execute() 28173 1726882759.02001: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.02004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.02013: variable 'omit' from source: magic vars 28173 1726882759.02308: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.02323: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.02524: variable 'network_state' from source: role '' defaults 28173 1726882759.02528: Evaluated conditional (network_state != {}): False 28173 1726882759.02531: when evaluation is False, skipping this task 28173 1726882759.02534: _execute() done 28173 1726882759.02537: dumping result to json 28173 1726882759.02539: done dumping result, returning 28173 1726882759.02542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-926c-8928-000000000018] 28173 1726882759.02544: sending task result for task 0e448fcc-3ce9-926c-8928-000000000018 28173 1726882759.02615: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000018 28173 1726882759.02618: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882759.02690: no more pending results, returning what we have 28173 1726882759.02694: results queue empty 28173 1726882759.02695: checking for any_errors_fatal 28173 1726882759.02702: done checking for any_errors_fatal 28173 1726882759.02703: checking for max_fail_percentage 28173 1726882759.02705: done checking for max_fail_percentage 28173 1726882759.02705: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.02706: done checking to see if all hosts have failed 28173 1726882759.02707: getting the remaining hosts for this loop 28173 1726882759.02708: done getting the remaining hosts for this loop 28173 1726882759.02711: getting the next task for host managed_node2 28173 1726882759.02717: done getting next task for host managed_node2 28173 1726882759.02721: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882759.02724: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.02738: getting variables 28173 1726882759.02740: in VariableManager get_vars() 28173 1726882759.02783: Calling all_inventory to load vars for managed_node2 28173 1726882759.02785: Calling groups_inventory to load vars for managed_node2 28173 1726882759.02787: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.02799: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.02802: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.02805: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.04093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.05027: done with get_vars() 28173 1726882759.05044: done getting variables 28173 1726882759.05088: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:19 -0400 (0:00:00.040) 0:00:12.215 ****** 28173 1726882759.05111: entering _queue_task() for managed_node2/fail 28173 1726882759.05301: worker is 1 (out of 1 available) 28173 1726882759.05315: exiting _queue_task() for managed_node2/fail 28173 1726882759.05327: done queuing things up, now waiting for results queue to drain 28173 1726882759.05329: waiting for pending results... 
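The skip above reports false_condition "network_state != {}", i.e. the guard never fires because no network_state was supplied. A hedged sketch of such a guard task; the initscripts-provider condition is an assumption added for illustration, since the log only shows the first (false) condition:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying `network_state` requires the nm (NetworkManager) provider.
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed second condition, not visible in the log
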
28173 1726882759.05501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882759.05643: in run() - task 0e448fcc-3ce9-926c-8928-000000000019 28173 1726882759.05669: variable 'ansible_search_path' from source: unknown 28173 1726882759.05673: variable 'ansible_search_path' from source: unknown 28173 1726882759.05754: calling self._execute() 28173 1726882759.05799: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.05803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.05813: variable 'omit' from source: magic vars 28173 1726882759.06175: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.06187: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.06386: variable 'network_state' from source: role '' defaults 28173 1726882759.06395: Evaluated conditional (network_state != {}): False 28173 1726882759.06398: when evaluation is False, skipping this task 28173 1726882759.06401: _execute() done 28173 1726882759.06404: dumping result to json 28173 1726882759.06406: done dumping result, returning 28173 1726882759.06413: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-926c-8928-000000000019] 28173 1726882759.06424: sending task result for task 0e448fcc-3ce9-926c-8928-000000000019 28173 1726882759.06510: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000019 28173 1726882759.06513: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882759.06580: no more pending results, returning what we have 28173 1726882759.06583: results queue empty 28173 1726882759.06584: checking for any_errors_fatal 28173 1726882759.06590: done checking for any_errors_fatal 28173 1726882759.06591: checking for max_fail_percentage 28173 1726882759.06593: done checking for max_fail_percentage 28173 1726882759.06593: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.06594: done checking to see if all hosts have failed 28173 1726882759.06595: getting the remaining hosts for this loop 28173 1726882759.06597: done getting the remaining hosts for this loop 28173 1726882759.06600: getting the next task for host managed_node2 28173 1726882759.06605: done getting next task for host managed_node2 28173 1726882759.06608: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882759.06611: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.06624: getting variables 28173 1726882759.06625: in VariableManager get_vars() 28173 1726882759.06661: Calling all_inventory to load vars for managed_node2 28173 1726882759.06673: Calling groups_inventory to load vars for managed_node2 28173 1726882759.06677: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.06686: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.06688: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.06692: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.07672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.09243: done with get_vars() 28173 1726882759.09259: done getting variables 28173 1726882759.09302: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:19 -0400 (0:00:00.042) 0:00:12.257 ****** 28173 1726882759.09327: entering _queue_task() for managed_node2/fail 28173 1726882759.09523: worker is 1 (out of 1 available) 28173 1726882759.09533: exiting _queue_task() for managed_node2/fail 28173 1726882759.09545: done queuing things up, now waiting for results queue to drain 28173 1726882759.09546: waiting for pending results... 
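The second abort task (tasks/main.yml:18) is skipped on the same first condition. A sketch of how such a version guard could be written; the `< 8` comparison is an assumption based on the task name, since Ansible stops at the first false `when` entry and reports only that one:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying a network state configuration requires a managed host running EL 8 or later.
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8  # assumed, inferred from the task name
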
28173 1726882759.09725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882759.09814: in run() - task 0e448fcc-3ce9-926c-8928-00000000001a 28173 1726882759.09826: variable 'ansible_search_path' from source: unknown 28173 1726882759.09830: variable 'ansible_search_path' from source: unknown 28173 1726882759.09861: calling self._execute() 28173 1726882759.09929: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.09933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.09941: variable 'omit' from source: magic vars 28173 1726882759.10210: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.10220: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.10340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.12438: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.12521: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.12560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.12609: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.12641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.12735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.12774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.12814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.12858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.12879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.12985: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.13006: Evaluated conditional (ansible_distribution_major_version | int > 9): False 28173 1726882759.13018: when evaluation is False, skipping this task 28173 1726882759.13025: _execute() done 28173 1726882759.13031: dumping result to json 28173 1726882759.13038: done dumping result, returning 28173 1726882759.13048: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-926c-8928-00000000001a] 28173 1726882759.13058: sending task result for task 
0e448fcc-3ce9-926c-8928-00000000001a skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 28173 1726882759.13206: no more pending results, returning what we have 28173 1726882759.13210: results queue empty 28173 1726882759.13211: checking for any_errors_fatal 28173 1726882759.13217: done checking for any_errors_fatal 28173 1726882759.13218: checking for max_fail_percentage 28173 1726882759.13219: done checking for max_fail_percentage 28173 1726882759.13220: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.13221: done checking to see if all hosts have failed 28173 1726882759.13222: getting the remaining hosts for this loop 28173 1726882759.13224: done getting the remaining hosts for this loop 28173 1726882759.13227: getting the next task for host managed_node2 28173 1726882759.13234: done getting next task for host managed_node2 28173 1726882759.13238: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882759.13241: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.13255: getting variables 28173 1726882759.13257: in VariableManager get_vars() 28173 1726882759.13303: Calling all_inventory to load vars for managed_node2 28173 1726882759.13306: Calling groups_inventory to load vars for managed_node2 28173 1726882759.13308: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.13319: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.13322: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.13325: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.14341: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001a 28173 1726882759.14344: WORKER PROCESS EXITING 28173 1726882759.15100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.16805: done with get_vars() 28173 1726882759.16831: done getting variables 28173 1726882759.16929: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:19 -0400 (0:00:00.076) 0:00:12.334 ****** 28173 1726882759.16967: entering _queue_task() for managed_node2/dnf 28173 1726882759.17240: worker is 1 (out of 1 available) 28173 1726882759.17252: exiting _queue_task() for managed_node2/dnf 28173 1726882759.17269: done queuing things up, now waiting for results queue to drain 28173 1726882759.17271: waiting for pending results... 
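Here the evaluated conditional is ansible_distribution_major_version | int > 9, which is False on this host, so the teaming abort is skipped. A hedged sketch of such a guard (any check that a team interface is actually requested is not visible in the log and is omitted):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming configuration is not supported on EL 10 or later.
      when:
        - ansible_distribution_major_version | int > 9
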
28173 1726882759.17547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882759.17680: in run() - task 0e448fcc-3ce9-926c-8928-00000000001b 28173 1726882759.17707: variable 'ansible_search_path' from source: unknown 28173 1726882759.17718: variable 'ansible_search_path' from source: unknown 28173 1726882759.17758: calling self._execute() 28173 1726882759.17854: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.17868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.17886: variable 'omit' from source: magic vars 28173 1726882759.18266: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.18283: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.18475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.21072: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.21140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.21203: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.21240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.21272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.21353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.21388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.21427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.21472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.21493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.21611: variable 'ansible_distribution' from source: facts 28173 1726882759.21627: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.21646: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28173 1726882759.21771: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882759.21909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.21940: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.21978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.22021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.22041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.22094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.22122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.22151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.22204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.22224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.22268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.22304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.22333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.22380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.22405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.22568: variable 'network_connections' from source: task vars 28173 1726882759.22585: variable 'interface' from source: set_fact 28173 1726882759.22660: variable 'interface' from source: set_fact 28173 1726882759.22675: variable 'interface' from source: set_fact 28173 1726882759.22747: variable 'interface' from source: set_fact 28173 1726882759.22824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 28173 1726882759.22994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882759.23036: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882759.23078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882759.23106: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882759.23165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882759.23192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882759.23225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.23251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882759.23316: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882759.23567: variable 'network_connections' from source: task vars 28173 1726882759.23577: variable 'interface' from source: set_fact 28173 1726882759.23647: variable 'interface' from source: set_fact 28173 1726882759.23658: variable 'interface' from source: set_fact 28173 1726882759.23728: variable 'interface' from source: set_fact 28173 1726882759.23771: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882759.23778: when evaluation is False, skipping this task 28173 1726882759.23785: _execute() done 28173 1726882759.23792: dumping result to json 28173 1726882759.23799: done dumping result, returning 28173 1726882759.23817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000001b] 28173 1726882759.23827: sending task result for task 0e448fcc-3ce9-926c-8928-00000000001b skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882759.23977: no more pending results, returning what we have 28173 1726882759.23981: results queue empty 28173 1726882759.23982: checking for any_errors_fatal 28173 1726882759.23989: done checking for any_errors_fatal 28173 1726882759.23990: checking for max_fail_percentage 28173 1726882759.23991: done checking for max_fail_percentage 28173 1726882759.23992: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.23993: done checking to see if all hosts have failed 28173 1726882759.23994: getting the remaining hosts for this loop 28173 1726882759.23995: done getting the remaining hosts for this loop 28173 1726882759.23999: getting the next task for host managed_node2 28173 1726882759.24005: done getting next task for host managed_node2 28173 
1726882759.24010: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882759.24013: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882759.24027: getting variables 28173 1726882759.24028: in VariableManager get_vars() 28173 1726882759.24075: Calling all_inventory to load vars for managed_node2 28173 1726882759.24078: Calling groups_inventory to load vars for managed_node2 28173 1726882759.24080: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.24090: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.24093: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.24097: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.25103: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001b 28173 1726882759.25107: WORKER PROCESS EXITING 28173 1726882759.25922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.27628: done with get_vars() 28173 1726882759.27652: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882759.27737: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:19 -0400 (0:00:00.108) 0:00:12.442 ****** 28173 1726882759.27773: entering _queue_task() for managed_node2/yum 28173 1726882759.27775: Creating lock for yum 28173 1726882759.28066: worker is 1 (out of 1 available) 28173 1726882759.28078: exiting _queue_task() for managed_node2/yum 28173 1726882759.28090: done queuing things up, now waiting for results queue to drain 28173 1726882759.28092: waiting for pending results... 
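The DNF update check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the configured connections. One plausible shape for such a check, shown only as an assumption: the log reveals the dnf action plugin and the two when conditions, not the task body, and the registered variable name __network_dnf_check is hypothetical:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true  # hypothetical body: a dry run to see whether updates would be pulled in
      register: __network_dnf_check  # hypothetical name
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
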
28173 1726882759.28381: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882759.28529: in run() - task 0e448fcc-3ce9-926c-8928-00000000001c 28173 1726882759.28553: variable 'ansible_search_path' from source: unknown 28173 1726882759.28561: variable 'ansible_search_path' from source: unknown 28173 1726882759.28606: calling self._execute() 28173 1726882759.28709: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.28726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.28738: variable 'omit' from source: magic vars 28173 1726882759.29097: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.29116: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.29303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.31700: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.31781: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.31825: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.31862: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.31904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.31985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.32025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.32055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.32103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.32123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.32207: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.32230: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28173 1726882759.32236: when evaluation is False, skipping this task 28173 1726882759.32242: _execute() done 28173 1726882759.32247: dumping result to json 28173 1726882759.32253: done dumping result, returning 28173 1726882759.32261: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000001c] 28173 
1726882759.32276: sending task result for task 0e448fcc-3ce9-926c-8928-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28173 1726882759.32423: no more pending results, returning what we have 28173 1726882759.32427: results queue empty 28173 1726882759.32428: checking for any_errors_fatal 28173 1726882759.32434: done checking for any_errors_fatal 28173 1726882759.32435: checking for max_fail_percentage 28173 1726882759.32437: done checking for max_fail_percentage 28173 1726882759.32438: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.32438: done checking to see if all hosts have failed 28173 1726882759.32439: getting the remaining hosts for this loop 28173 1726882759.32441: done getting the remaining hosts for this loop 28173 1726882759.32444: getting the next task for host managed_node2 28173 1726882759.32451: done getting next task for host managed_node2 28173 1726882759.32455: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882759.32459: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.32474: getting variables 28173 1726882759.32476: in VariableManager get_vars() 28173 1726882759.32520: Calling all_inventory to load vars for managed_node2 28173 1726882759.32523: Calling groups_inventory to load vars for managed_node2 28173 1726882759.32525: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.32536: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.32539: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.32542: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.33603: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001c 28173 1726882759.33606: WORKER PROCESS EXITING 28173 1726882759.37949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.39697: done with get_vars() 28173 1726882759.39722: done getting variables 28173 1726882759.39772: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:19 -0400 (0:00:00.120) 0:00:12.562 ****** 28173 1726882759.39807: entering _queue_task() for managed_node2/fail 28173 1726882759.40141: worker is 1 (out of 1 available) 28173 1726882759.40153: exiting _queue_task() for managed_node2/fail 28173 1726882759.40167: done queuing things up, now waiting for results queue to drain 28173 1726882759.40168: waiting for pending results... 
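The YUM counterpart is skipped because ansible_distribution_major_version | int < 8 is False on this host; note the log's "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf", so the dnf action plugin serves this request anyway. A mirror of the previous sketch, equally hypothetical in its body:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true  # hypothetical body, mirroring the DNF sketch above
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined  # assumed, inferred from the task name
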
28173 1726882759.40454: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882759.40595: in run() - task 0e448fcc-3ce9-926c-8928-00000000001d 28173 1726882759.40620: variable 'ansible_search_path' from source: unknown 28173 1726882759.40629: variable 'ansible_search_path' from source: unknown 28173 1726882759.40672: calling self._execute() 28173 1726882759.40774: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.40791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.40807: variable 'omit' from source: magic vars 28173 1726882759.41241: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.41262: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.41399: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882759.41610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.43990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.44074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.44117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.44157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.44200: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.44290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.44329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.44362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.44421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.44439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.44491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.44523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.44552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.44605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.44625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.44670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.44698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.44736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.44781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.44799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.44989: variable 'network_connections' from source: task vars 28173 1726882759.45005: variable 'interface' from source: set_fact 28173 1726882759.45087: variable 'interface' from source: set_fact 28173 1726882759.45099: variable 'interface' from source: set_fact 28173 1726882759.45172: variable 'interface' from source: set_fact 28173 1726882759.45253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882759.45440: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882759.45490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882759.45524: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882759.45555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882759.45609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882759.45634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882759.45661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.45703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882759.45774: 
variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882759.46029: variable 'network_connections' from source: task vars 28173 1726882759.46039: variable 'interface' from source: set_fact 28173 1726882759.46103: variable 'interface' from source: set_fact 28173 1726882759.46113: variable 'interface' from source: set_fact 28173 1726882759.46181: variable 'interface' from source: set_fact 28173 1726882759.46223: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882759.46231: when evaluation is False, skipping this task 28173 1726882759.46239: _execute() done 28173 1726882759.46247: dumping result to json 28173 1726882759.46255: done dumping result, returning 28173 1726882759.46266: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000001d] 28173 1726882759.46289: sending task result for task 0e448fcc-3ce9-926c-8928-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882759.46432: no more pending results, returning what we have 28173 1726882759.46436: results queue empty 28173 1726882759.46436: checking for any_errors_fatal 28173 1726882759.46442: done checking for any_errors_fatal 28173 1726882759.46442: checking for max_fail_percentage 28173 1726882759.46444: done checking for max_fail_percentage 28173 1726882759.46445: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.46446: done checking to see if all hosts have failed 28173 1726882759.46447: getting the remaining hosts for this loop 28173 1726882759.46448: done getting the remaining hosts for this loop 28173 1726882759.46453: getting the next task for host managed_node2 28173 1726882759.46460: done getting next task for host managed_node2 28173 1726882759.46466: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28173 1726882759.46469: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.46482: getting variables 28173 1726882759.46484: in VariableManager get_vars() 28173 1726882759.46522: Calling all_inventory to load vars for managed_node2 28173 1726882759.46525: Calling groups_inventory to load vars for managed_node2 28173 1726882759.46527: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.46536: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.46539: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.46541: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.47583: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001d 28173 1726882759.47587: WORKER PROCESS EXITING 28173 1726882759.48222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.50623: done with get_vars() 28173 1726882759.50649: done getting variables 28173 1726882759.50711: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:19 -0400 (0:00:00.109) 0:00:12.671 ****** 28173 1726882759.50745: entering _queue_task() for managed_node2/package 28173 1726882759.51248: worker is 1 (out of 1 available) 28173 1726882759.51258: exiting _queue_task() for managed_node2/package 28173 1726882759.51271: done queuing things up, now waiting for results queue to drain 28173 1726882759.51273: waiting for pending results... 
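The consent prompt is skipped for the same reason as the DNF/YUM checks: no wireless or team connection is defined. A hedged sketch of such a task; the real role presumably also consults a user-facing consent flag, which is not visible in this log and is therefore left out:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Enabling wireless or team interfaces requires restarting NetworkManager;
          re-run after explicitly allowing the restart.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
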
28173 1726882759.52350: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28173 1726882759.52510: in run() - task 0e448fcc-3ce9-926c-8928-00000000001e 28173 1726882759.52529: variable 'ansible_search_path' from source: unknown 28173 1726882759.52536: variable 'ansible_search_path' from source: unknown 28173 1726882759.52577: calling self._execute() 28173 1726882759.52759: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.52841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.52855: variable 'omit' from source: magic vars 28173 1726882759.53572: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.53688: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.54035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882759.54283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882759.54326: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882759.54487: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882759.54529: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882759.54662: variable 'network_packages' from source: role '' defaults 28173 1726882759.54778: variable '__network_provider_setup' from source: role '' defaults 28173 1726882759.54798: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882759.54870: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882759.54884: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882759.54951: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882759.55143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.57589: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.57661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.57705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.57748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.57781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.57860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.57897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.57969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.58015: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.58180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.58226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.58248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.58279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.58414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.58430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.58936: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882759.59158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.59190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.59218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.59269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.59368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.59573: variable 'ansible_python' from source: facts 28173 1726882759.59602: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882759.59792: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882759.59875: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882759.60092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.60238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28173 1726882759.60268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.60361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.60446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.60497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.60563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.60677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.60720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.60738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.61110: variable 'network_connections' from source: task vars 28173 1726882759.61120: variable 'interface' from source: set_fact 28173 1726882759.61336: variable 'interface' from source: set_fact 28173 1726882759.61349: variable 'interface' from source: set_fact 28173 1726882759.61568: variable 'interface' from source: set_fact 28173 1726882759.61650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882759.61758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882759.61791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.61875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882759.61925: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882759.62548: variable 'network_connections' from source: task vars 28173 1726882759.62607: variable 'interface' from source: set_fact 28173 1726882759.62823: variable 'interface' from source: set_fact 28173 1726882759.62837: variable 'interface' from source: set_fact 28173 1726882759.63049: variable 'interface' from source: set_fact 28173 1726882759.63115: variable 
'__network_packages_default_wireless' from source: role '' defaults 28173 1726882759.63332: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882759.64042: variable 'network_connections' from source: task vars 28173 1726882759.64052: variable 'interface' from source: set_fact 28173 1726882759.64127: variable 'interface' from source: set_fact 28173 1726882759.64138: variable 'interface' from source: set_fact 28173 1726882759.64296: variable 'interface' from source: set_fact 28173 1726882759.64332: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882759.64530: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882759.65182: variable 'network_connections' from source: task vars 28173 1726882759.65192: variable 'interface' from source: set_fact 28173 1726882759.65313: variable 'interface' from source: set_fact 28173 1726882759.65328: variable 'interface' from source: set_fact 28173 1726882759.65394: variable 'interface' from source: set_fact 28173 1726882759.65651: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882759.65719: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882759.65761: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882759.65922: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882759.66347: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882759.67331: variable 'network_connections' from source: task vars 28173 1726882759.67341: variable 'interface' from source: set_fact 28173 1726882759.67519: variable 'interface' from source: set_fact 28173 1726882759.67530: variable 'interface' from source: set_fact 28173 1726882759.67705: variable 'interface' from source: set_fact 28173 1726882759.67724: variable 'ansible_distribution' from source: facts 28173 1726882759.67732: variable '__network_rh_distros' from source: role '' defaults 28173 1726882759.67740: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.67766: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882759.68049: variable 'ansible_distribution' from source: facts 28173 1726882759.68139: variable '__network_rh_distros' from source: role '' defaults 28173 1726882759.68149: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.68171: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882759.68447: variable 'ansible_distribution' from source: facts 28173 1726882759.68575: variable '__network_rh_distros' from source: role '' defaults 28173 1726882759.68586: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.68626: variable 'network_provider' from source: set_fact 28173 1726882759.68646: variable 'ansible_facts' from source: unknown 28173 1726882759.70068: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28173 1726882759.70193: when evaluation is False, skipping this task 28173 1726882759.70201: _execute() done 28173 1726882759.70208: dumping result to json 28173 1726882759.70215: done dumping result, returning 28173 1726882759.70226: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0e448fcc-3ce9-926c-8928-00000000001e] 28173 1726882759.70305: sending task result for task 0e448fcc-3ce9-926c-8928-00000000001e skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28173 1726882759.70460: no more pending results, returning what we have 28173 1726882759.70466: results queue empty 28173 1726882759.70467: checking for any_errors_fatal 28173 1726882759.70474: done checking for any_errors_fatal 28173 1726882759.70475: checking for max_fail_percentage 28173 1726882759.70476: done checking for max_fail_percentage 28173 1726882759.70477: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.70478: done checking to see if all hosts have failed 28173 1726882759.70479: getting the remaining hosts for this loop 28173 1726882759.70481: done getting the remaining hosts for this loop 28173 1726882759.70485: getting the next task for host managed_node2 28173 1726882759.70491: done getting next task for host managed_node2 28173 1726882759.70496: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882759.70499: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.70513: getting variables 28173 1726882759.70515: in VariableManager get_vars() 28173 1726882759.70559: Calling all_inventory to load vars for managed_node2 28173 1726882759.70568: Calling groups_inventory to load vars for managed_node2 28173 1726882759.70572: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.70583: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.70586: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.70589: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.71583: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001e 28173 1726882759.71586: WORKER PROCESS EXITING 28173 1726882759.73477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.75361: done with get_vars() 28173 1726882759.75385: done getting variables 28173 1726882759.75443: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:19 -0400 (0:00:00.247) 0:00:12.919 ****** 28173 1726882759.75478: entering _queue_task() for managed_node2/package 28173 1726882759.75756: worker is 1 (out of 1 available) 28173 1726882759.75770: exiting _queue_task() for managed_node2/package 28173 1726882759.75782: done queuing things up, now waiting for results queue to drain 28173 1726882759.75783: waiting for pending results... 
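The skip recorded above is the role's package-install guard at work: the task compares the requested network_packages list against the package facts already gathered on the host, and since every requested package is already installed, the conditional "not network_packages is subset(ansible_facts.packages.keys())" evaluates to False. A minimal sketch of a task with this shape (illustrative only, not the role's verbatim source; the variable wiring is assumed, the condition is quoted from the skip result above):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

With ansible_facts.packages populated by an earlier package_facts run, the subset test lets the role avoid invoking the package manager at all when nothing needs installing, which is why the result is reported as skipping rather than ok or changed.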
28173 1726882759.76060: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882759.76197: in run() - task 0e448fcc-3ce9-926c-8928-00000000001f 28173 1726882759.76217: variable 'ansible_search_path' from source: unknown 28173 1726882759.76228: variable 'ansible_search_path' from source: unknown 28173 1726882759.76267: calling self._execute() 28173 1726882759.76356: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.76370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.76383: variable 'omit' from source: magic vars 28173 1726882759.76788: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.76805: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.76929: variable 'network_state' from source: role '' defaults 28173 1726882759.76942: Evaluated conditional (network_state != {}): False 28173 1726882759.76948: when evaluation is False, skipping this task 28173 1726882759.76954: _execute() done 28173 1726882759.76960: dumping result to json 28173 1726882759.76972: done dumping result, returning 28173 1726882759.76986: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-926c-8928-00000000001f] 28173 1726882759.76999: sending task result for task 0e448fcc-3ce9-926c-8928-00000000001f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882759.77147: no more pending results, returning what we have 28173 1726882759.77151: results queue empty 28173 1726882759.77152: checking for any_errors_fatal 28173 1726882759.77160: done checking for any_errors_fatal 28173 1726882759.77161: checking for max_fail_percentage 28173 1726882759.77163: done checking for max_fail_percentage 28173 1726882759.77165: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.77166: done checking to see if all hosts have failed 28173 1726882759.77167: getting the remaining hosts for this loop 28173 1726882759.77169: done getting the remaining hosts for this loop 28173 1726882759.77172: getting the next task for host managed_node2 28173 1726882759.77179: done getting next task for host managed_node2 28173 1726882759.77183: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882759.77187: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.77202: getting variables 28173 1726882759.77204: in VariableManager get_vars() 28173 1726882759.77248: Calling all_inventory to load vars for managed_node2 28173 1726882759.77251: Calling groups_inventory to load vars for managed_node2 28173 1726882759.77253: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.77267: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.77270: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.77273: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.78386: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000001f 28173 1726882759.78389: WORKER PROCESS EXITING 28173 1726882759.79871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.81908: done with get_vars() 28173 1726882759.81931: done getting variables 28173 1726882759.82227: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:19 -0400 (0:00:00.067) 0:00:12.987 ****** 28173 1726882759.82260: entering _queue_task() for managed_node2/package 28173 1726882759.82561: worker is 1 (out of 1 available) 28173 1726882759.82577: exiting _queue_task() for managed_node2/package 28173 1726882759.82589: done queuing things up, now waiting for results queue to drain 28173 1726882759.82590: waiting for pending results... 
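Both network_state-gated install tasks in this block, the NetworkManager/nmstate one skipped above and the python3-libnmstate one just queued, follow the same pattern: they only run when the caller supplies a non-empty network_state, so each is skipped here with network_state != {} as the false condition. A hedged sketch of what such a task looks like (the package list is inferred from the task name, not copied from the role):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

The distribution guard ansible_distribution_major_version != '6' that appears just before each skip is layered on top of this condition, so the install would only run when both evaluate True.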
28173 1726882759.82881: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882759.83004: in run() - task 0e448fcc-3ce9-926c-8928-000000000020 28173 1726882759.83017: variable 'ansible_search_path' from source: unknown 28173 1726882759.83020: variable 'ansible_search_path' from source: unknown 28173 1726882759.83066: calling self._execute() 28173 1726882759.83165: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.83174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.83184: variable 'omit' from source: magic vars 28173 1726882759.83610: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.83622: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.83802: variable 'network_state' from source: role '' defaults 28173 1726882759.83812: Evaluated conditional (network_state != {}): False 28173 1726882759.83815: when evaluation is False, skipping this task 28173 1726882759.83824: _execute() done 28173 1726882759.83827: dumping result to json 28173 1726882759.83831: done dumping result, returning 28173 1726882759.83839: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-926c-8928-000000000020] 28173 1726882759.83846: sending task result for task 0e448fcc-3ce9-926c-8928-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882759.83997: no more pending results, returning what we have 28173 1726882759.84001: results queue empty 28173 1726882759.84002: checking for any_errors_fatal 28173 1726882759.84012: done checking for any_errors_fatal 28173 1726882759.84013: checking for max_fail_percentage 28173 1726882759.84015: done checking for max_fail_percentage 28173 1726882759.84016: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.84017: done checking to see if all hosts have failed 28173 1726882759.84018: getting the remaining hosts for this loop 28173 1726882759.84019: done getting the remaining hosts for this loop 28173 1726882759.84023: getting the next task for host managed_node2 28173 1726882759.84029: done getting next task for host managed_node2 28173 1726882759.84035: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882759.84039: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.84060: getting variables 28173 1726882759.84063: in VariableManager get_vars() 28173 1726882759.84111: Calling all_inventory to load vars for managed_node2 28173 1726882759.84113: Calling groups_inventory to load vars for managed_node2 28173 1726882759.84116: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.84130: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.84133: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.84137: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.84728: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000020 28173 1726882759.84732: WORKER PROCESS EXITING 28173 1726882759.85786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.88481: done with get_vars() 28173 1726882759.88504: done getting variables 28173 1726882759.88630: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:19 -0400 (0:00:00.064) 0:00:13.051 ****** 28173 1726882759.88663: entering _queue_task() for managed_node2/service 28173 1726882759.88666: Creating lock for service 28173 1726882759.89087: worker is 1 (out of 1 available) 28173 1726882759.89100: exiting _queue_task() for managed_node2/service 28173 1726882759.89114: done queuing things up, now waiting for results queue to drain 28173 1726882759.89115: waiting for pending results... 
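Two details worth noting above: the service action plugin is being used for the first time in this run, so its load reports found_in_cache=False and a "Creating lock for service" entry precedes the worker starting; and the task itself only restarts NetworkManager when wireless or team connections are requested. A minimal sketch of that task shape (illustrative, not the role's exact source):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

In this run both flags evaluate to False a few entries below, so the restart is skipped as well.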
28173 1726882759.89312: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882759.89405: in run() - task 0e448fcc-3ce9-926c-8928-000000000021 28173 1726882759.89416: variable 'ansible_search_path' from source: unknown 28173 1726882759.89419: variable 'ansible_search_path' from source: unknown 28173 1726882759.89449: calling self._execute() 28173 1726882759.89526: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882759.89530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882759.89538: variable 'omit' from source: magic vars 28173 1726882759.89812: variable 'ansible_distribution_major_version' from source: facts 28173 1726882759.89822: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882759.89907: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882759.90035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882759.92587: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882759.92658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882759.92714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882759.92753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882759.93106: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882759.93194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.93289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.93352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.93403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.93425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.93482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.93519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.93552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28173 1726882759.93599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.93617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.93663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882759.93695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882759.93723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.93770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882759.93793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882759.94081: variable 'network_connections' from source: task vars 28173 1726882759.94092: variable 'interface' from source: set_fact 28173 1726882759.94160: variable 'interface' from source: set_fact 28173 1726882759.94170: variable 'interface' from source: set_fact 28173 1726882759.94215: variable 'interface' from source: set_fact 28173 1726882759.94270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882759.94380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882759.94408: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882759.94432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882759.94462: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882759.94494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882759.94509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882759.94529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882759.94547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882759.94592: variable '__network_team_connections_defined' from source: role '' defaults 
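The bursts of 'network_connections' and 'interface' resolutions above come from evaluating the role's "... connections defined" flags, each of which inspects the requested connection profiles and therefore re-renders their templated fields, including the interface name injected by set_fact. Purely as an illustration of how such a flag can be derived, and not the role's actual default, a Jinja2 expression over network_connections might look like:

# Hypothetical sketch of a derived flag; the real role default may differ.
__network_team_connections_defined: >-
  {{ network_connections | default([])
                         | selectattr('type', 'defined')
                         | selectattr('type', 'eq', 'team')
                         | list | length > 0 }}

An expression of this kind is consistent with the pattern in the trace: each flag lookup is immediately preceded by a fresh resolution of network_connections and the interface fact it references.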
28173 1726882759.94741: variable 'network_connections' from source: task vars 28173 1726882759.94744: variable 'interface' from source: set_fact 28173 1726882759.94789: variable 'interface' from source: set_fact 28173 1726882759.94795: variable 'interface' from source: set_fact 28173 1726882759.94835: variable 'interface' from source: set_fact 28173 1726882759.94863: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882759.94870: when evaluation is False, skipping this task 28173 1726882759.94873: _execute() done 28173 1726882759.94875: dumping result to json 28173 1726882759.94877: done dumping result, returning 28173 1726882759.94883: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-000000000021] 28173 1726882759.94892: sending task result for task 0e448fcc-3ce9-926c-8928-000000000021 28173 1726882759.94976: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000021 28173 1726882759.94980: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882759.95022: no more pending results, returning what we have 28173 1726882759.95025: results queue empty 28173 1726882759.95025: checking for any_errors_fatal 28173 1726882759.95032: done checking for any_errors_fatal 28173 1726882759.95032: checking for max_fail_percentage 28173 1726882759.95034: done checking for max_fail_percentage 28173 1726882759.95035: checking to see if all hosts have failed and the running result is not ok 28173 1726882759.95036: done checking to see if all hosts have failed 28173 1726882759.95036: getting the remaining hosts for this loop 28173 1726882759.95038: done getting the remaining hosts for this loop 28173 1726882759.95041: getting the next task for host managed_node2 28173 1726882759.95048: done getting next task for host managed_node2 28173 1726882759.95051: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882759.95054: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882759.95072: getting variables 28173 1726882759.95075: in VariableManager get_vars() 28173 1726882759.95114: Calling all_inventory to load vars for managed_node2 28173 1726882759.95116: Calling groups_inventory to load vars for managed_node2 28173 1726882759.95118: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882759.95127: Calling all_plugins_play to load vars for managed_node2 28173 1726882759.95129: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882759.95132: Calling groups_plugins_play to load vars for managed_node2 28173 1726882759.97277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882759.99237: done with get_vars() 28173 1726882759.99259: done getting variables 28173 1726882759.99322: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:19 -0400 (0:00:00.106) 0:00:13.157 ****** 28173 1726882759.99355: entering _queue_task() for managed_node2/service 28173 1726882759.99628: worker is 1 (out of 1 available) 28173 1726882759.99639: exiting _queue_task() for managed_node2/service 28173 1726882759.99650: done queuing things up, now waiting for results queue to drain 28173 1726882759.99651: waiting for pending results... 28173 1726882759.99931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882760.00072: in run() - task 0e448fcc-3ce9-926c-8928-000000000022 28173 1726882760.00098: variable 'ansible_search_path' from source: unknown 28173 1726882760.00105: variable 'ansible_search_path' from source: unknown 28173 1726882760.00144: calling self._execute() 28173 1726882760.00245: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882760.00256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882760.00276: variable 'omit' from source: magic vars 28173 1726882760.00651: variable 'ansible_distribution_major_version' from source: facts 28173 1726882760.00675: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882760.00838: variable 'network_provider' from source: set_fact 28173 1726882760.00851: variable 'network_state' from source: role '' defaults 28173 1726882760.00868: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28173 1726882760.00880: variable 'omit' from source: magic vars 28173 1726882760.00933: variable 'omit' from source: magic vars 28173 1726882760.00975: variable 'network_service_name' from source: role '' defaults 28173 1726882760.01047: variable 'network_service_name' from source: role '' defaults 28173 1726882760.01168: variable '__network_provider_setup' from source: role '' defaults 28173 1726882760.01182: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882760.01248: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882760.01262: variable '__network_packages_default_nm' from source: role '' defaults 
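Unlike the preceding tasks, "Enable and start NetworkManager" passes its guard: network_provider comes from a set_fact and, with network_state still at its empty default, the True result of network_provider == "nm" or network_state != {} implies the provider is "nm", so the worker continues into the provider-setup defaults instead of short-circuiting. A hedged sketch of a task with this shape (the state/enabled details are inferred from the task name, the condition is quoted from the trace):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

Because this task actually executes, the trace below shifts from variable resolution to real work: SSH connection variables are pinned, the systemd module is packaged with ANSIBALLZ, and the payload is copied to a remote temp directory before being run.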
28173 1726882760.01335: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882760.01579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882760.03933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882760.04019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882760.04062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882760.04110: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882760.04140: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882760.04224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.04258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.04300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.04350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.04375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.04427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.04457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.04491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.04541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.04562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.04803: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882760.04921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.04949: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.04988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.05032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.05050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.05144: variable 'ansible_python' from source: facts 28173 1726882760.05177: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882760.05257: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882760.05346: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882760.05482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.05517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.05545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.05593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.05612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.05668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.05707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.05739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.05788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.05806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.05948: variable 'network_connections' from 
source: task vars 28173 1726882760.05961: variable 'interface' from source: set_fact 28173 1726882760.06038: variable 'interface' from source: set_fact 28173 1726882760.06057: variable 'interface' from source: set_fact 28173 1726882760.06131: variable 'interface' from source: set_fact 28173 1726882760.06267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882760.06455: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882760.06512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882760.06556: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882760.06609: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882760.06676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882760.06716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882760.06753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.06796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882760.06848: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882760.07155: variable 'network_connections' from source: task vars 28173 1726882760.07172: variable 'interface' from source: set_fact 28173 1726882760.07253: variable 'interface' from source: set_fact 28173 1726882760.07275: variable 'interface' from source: set_fact 28173 1726882760.07350: variable 'interface' from source: set_fact 28173 1726882760.07434: variable '__network_packages_default_wireless' from source: role '' defaults 28173 1726882760.07517: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882760.07802: variable 'network_connections' from source: task vars 28173 1726882760.07811: variable 'interface' from source: set_fact 28173 1726882760.07884: variable 'interface' from source: set_fact 28173 1726882760.07900: variable 'interface' from source: set_fact 28173 1726882760.07970: variable 'interface' from source: set_fact 28173 1726882760.08006: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882760.08086: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882760.08382: variable 'network_connections' from source: task vars 28173 1726882760.08391: variable 'interface' from source: set_fact 28173 1726882760.08461: variable 'interface' from source: set_fact 28173 1726882760.08476: variable 'interface' from source: set_fact 28173 1726882760.08547: variable 'interface' from source: set_fact 28173 1726882760.08616: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882760.08683: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 
1726882760.08695: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882760.08754: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882760.08974: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882760.09648: variable 'network_connections' from source: task vars 28173 1726882760.09659: variable 'interface' from source: set_fact 28173 1726882760.09726: variable 'interface' from source: set_fact 28173 1726882760.09738: variable 'interface' from source: set_fact 28173 1726882760.09804: variable 'interface' from source: set_fact 28173 1726882760.09823: variable 'ansible_distribution' from source: facts 28173 1726882760.09831: variable '__network_rh_distros' from source: role '' defaults 28173 1726882760.09840: variable 'ansible_distribution_major_version' from source: facts 28173 1726882760.09872: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882760.10045: variable 'ansible_distribution' from source: facts 28173 1726882760.10054: variable '__network_rh_distros' from source: role '' defaults 28173 1726882760.10068: variable 'ansible_distribution_major_version' from source: facts 28173 1726882760.10087: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882760.10260: variable 'ansible_distribution' from source: facts 28173 1726882760.10274: variable '__network_rh_distros' from source: role '' defaults 28173 1726882760.10286: variable 'ansible_distribution_major_version' from source: facts 28173 1726882760.10326: variable 'network_provider' from source: set_fact 28173 1726882760.10352: variable 'omit' from source: magic vars 28173 1726882760.10385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882760.10421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882760.10441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882760.10462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882760.10483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882760.10519: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882760.10527: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882760.10535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882760.10612: Set connection var ansible_pipelining to False 28173 1726882760.10622: Set connection var ansible_shell_type to sh 28173 1726882760.10646: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882760.10649: Set connection var ansible_timeout to 10 28173 1726882760.10673: Set connection var ansible_shell_executable to /bin/sh 28173 1726882760.10676: Set connection var ansible_connection to ssh 28173 1726882760.10688: variable 'ansible_shell_executable' from source: unknown 28173 1726882760.10691: variable 'ansible_connection' from source: unknown 28173 1726882760.10693: variable 'ansible_module_compression' from source: unknown 28173 1726882760.10695: variable 'ansible_shell_type' from source: unknown 28173 1726882760.10698: variable 
'ansible_shell_executable' from source: unknown 28173 1726882760.10700: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882760.10705: variable 'ansible_pipelining' from source: unknown 28173 1726882760.10707: variable 'ansible_timeout' from source: unknown 28173 1726882760.10709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882760.10786: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882760.10794: variable 'omit' from source: magic vars 28173 1726882760.10799: starting attempt loop 28173 1726882760.10802: running the handler 28173 1726882760.10867: variable 'ansible_facts' from source: unknown 28173 1726882760.11324: _low_level_execute_command(): starting 28173 1726882760.11330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882760.11801: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.11809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.11836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.11849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.11861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.11913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.11919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882760.11931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.12048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.13710: stdout chunk (state=3): >>>/root <<< 28173 1726882760.13946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882760.13949: stdout chunk (state=3): >>><<< 28173 1726882760.13952: stderr chunk (state=3): >>><<< 28173 1726882760.13955: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882760.13957: _low_level_execute_command(): starting 28173 1726882760.13960: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645 `" && echo ansible-tmp-1726882760.1389303-28790-226841761938645="` echo /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645 `" ) && sleep 0' 28173 1726882760.14595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.14598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.14668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.14679: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.14689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882760.14695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.14704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.14710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.14777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.14792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882760.14802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.14913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.16785: stdout chunk (state=3): >>>ansible-tmp-1726882760.1389303-28790-226841761938645=/root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645 <<< 28173 1726882760.16894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882760.16953: stderr chunk (state=3): >>><<< 28173 1726882760.16956: stdout chunk (state=3): >>><<< 28173 1726882760.16976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882760.1389303-28790-226841761938645=/root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882760.17002: variable 'ansible_module_compression' from source: unknown 28173 1726882760.17059: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 28173 1726882760.17071: ANSIBALLZ: Acquiring lock 28173 1726882760.17075: ANSIBALLZ: Lock acquired: 140243978110592 28173 1726882760.17077: ANSIBALLZ: Creating module 28173 1726882760.43817: ANSIBALLZ: Writing module into payload 28173 1726882760.44219: ANSIBALLZ: Writing module 28173 1726882760.44247: ANSIBALLZ: Renaming module 28173 1726882760.44254: ANSIBALLZ: Done creating module 28173 1726882760.44499: variable 'ansible_facts' from source: unknown 28173 1726882760.44954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/AnsiballZ_systemd.py 28173 1726882760.45205: Sending initial data 28173 1726882760.45214: Sent initial data (156 bytes) 28173 1726882760.46191: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882760.46209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.46233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.46253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.46298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.46310: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882760.46323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.46345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882760.46356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882760.46371: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882760.46383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.46395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.46410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.46420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.46430: stderr chunk (state=3): >>>debug2: match found <<< 28173 
1726882760.46442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.46520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.46542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882760.46564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.46704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.48549: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882760.48641: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882760.48749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpsrms2z_a /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/AnsiballZ_systemd.py <<< 28173 1726882760.48837: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882760.52562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882760.52627: stderr chunk (state=3): >>><<< 28173 1726882760.52630: stdout chunk (state=3): >>><<< 28173 1726882760.52650: done transferring module to remote 28173 1726882760.52741: _low_level_execute_command(): starting 28173 1726882760.52745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/ /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/AnsiballZ_systemd.py && sleep 0' 28173 1726882760.53341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882760.53355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.53371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.53393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.53435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.53447: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882760.53461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.53483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882760.53497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882760.53514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882760.53542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.53569: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28173 1726882760.53588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.53610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.53637: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882760.53660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.53761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.53790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882760.53814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.53939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.55805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882760.55826: stderr chunk (state=3): >>><<< 28173 1726882760.55829: stdout chunk (state=3): >>><<< 28173 1726882760.55920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882760.55927: _low_level_execute_command(): starting 28173 1726882760.55931: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/AnsiballZ_systemd.py && sleep 0' 28173 1726882760.56900: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.56903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.56944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.56949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.56954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.56972: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882760.56977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.57055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.57061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.57271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.82116: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 28173 1726882760.82175: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9207808", "MemoryAvailable": "infinity", "CPUUsageNSec": "1929724000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not 
set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 28173 1726882760.82183: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target 
system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28173 1726882760.83757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882760.83761: stdout chunk (state=3): >>><<< 28173 1726882760.83773: stderr chunk (state=3): >>><<< 28173 1726882760.83789: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9207808", "MemoryAvailable": "infinity", "CPUUsageNSec": "1929724000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882760.83980: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882760.83999: _low_level_execute_command(): starting 28173 1726882760.84003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882760.1389303-28790-226841761938645/ > /dev/null 2>&1 && sleep 0' 28173 1726882760.84666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882760.84679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.84689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.84703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.84746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.84752: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882760.84761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.84780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882760.84787: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882760.84793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882760.84801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882760.84810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882760.84820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882760.84827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882760.84840: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882760.84849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882760.84925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882760.84943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882760.84957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882760.85090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882760.86978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882760.86981: stdout chunk (state=3): >>><<< 28173 1726882760.86989: stderr chunk (state=3): >>><<< 28173 1726882760.87005: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882760.87012: handler run complete 28173 1726882760.87076: attempt loop complete, returning result 28173 1726882760.87080: _execute() done 28173 1726882760.87082: dumping result to json 28173 1726882760.87101: done dumping result, returning 28173 1726882760.87112: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-926c-8928-000000000022] 28173 1726882760.87118: sending task result for task 0e448fcc-3ce9-926c-8928-000000000022 28173 1726882760.87460: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000022 28173 1726882760.87469: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882760.87522: no more pending results, returning what we have 28173 1726882760.87526: results queue empty 28173 1726882760.87526: checking for any_errors_fatal 28173 1726882760.87534: done checking for any_errors_fatal 28173 1726882760.87535: checking for max_fail_percentage 28173 1726882760.87536: done checking for max_fail_percentage 28173 1726882760.87537: checking to see if all hosts have failed and the running result is not ok 28173 1726882760.87538: done checking to see if all hosts have failed 28173 1726882760.87539: getting the remaining hosts for this loop 28173 1726882760.87540: done getting the remaining hosts for this loop 28173 1726882760.87544: getting the next task for host managed_node2 28173 1726882760.87551: done getting next task for host managed_node2 28173 1726882760.87555: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882760.87558: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882760.87578: getting variables 28173 1726882760.87581: in VariableManager get_vars() 28173 1726882760.87620: Calling all_inventory to load vars for managed_node2 28173 1726882760.87622: Calling groups_inventory to load vars for managed_node2 28173 1726882760.87625: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882760.87637: Calling all_plugins_play to load vars for managed_node2 28173 1726882760.87640: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882760.87643: Calling groups_plugins_play to load vars for managed_node2 28173 1726882760.89783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882760.91678: done with get_vars() 28173 1726882760.91701: done getting variables 28173 1726882760.91770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:20 -0400 (0:00:00.924) 0:00:14.082 ****** 28173 1726882760.91811: entering _queue_task() for managed_node2/service 28173 1726882760.92137: worker is 1 (out of 1 available) 28173 1726882760.92149: exiting _queue_task() for managed_node2/service 28173 1726882760.92169: done queuing things up, now waiting for results queue to drain 28173 1726882760.92171: waiting for pending results... 28173 1726882760.92591: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882760.92757: in run() - task 0e448fcc-3ce9-926c-8928-000000000023 28173 1726882760.92776: variable 'ansible_search_path' from source: unknown 28173 1726882760.92780: variable 'ansible_search_path' from source: unknown 28173 1726882760.92819: calling self._execute() 28173 1726882760.92926: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882760.92932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882760.92942: variable 'omit' from source: magic vars 28173 1726882760.93330: variable 'ansible_distribution_major_version' from source: facts 28173 1726882760.93343: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882760.93475: variable 'network_provider' from source: set_fact 28173 1726882760.93482: Evaluated conditional (network_provider == "nm"): True 28173 1726882760.93580: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882760.93654: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882760.93826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882760.98078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882760.98153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882760.98197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882760.98241: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882760.98275: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882760.98370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.98404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.98434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.98486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.98504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.98559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.98589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.98616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.98657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.98684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882760.98726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882760.98752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882760.98785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.98828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882760.98845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28173 1726882760.98999: variable 'network_connections' from source: task vars 28173 1726882760.99015: variable 'interface' from source: set_fact 28173 1726882760.99094: variable 'interface' from source: set_fact 28173 1726882760.99109: variable 'interface' from source: set_fact 28173 1726882760.99190: variable 'interface' from source: set_fact 28173 1726882760.99485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882760.99647: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882760.99693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882760.99727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882760.99758: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882760.99809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882760.99834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882760.99861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882760.99899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882760.99949: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882761.00226: variable 'network_connections' from source: task vars 28173 1726882761.00236: variable 'interface' from source: set_fact 28173 1726882761.00300: variable 'interface' from source: set_fact 28173 1726882761.00311: variable 'interface' from source: set_fact 28173 1726882761.00380: variable 'interface' from source: set_fact 28173 1726882761.00428: Evaluated conditional (__network_wpa_supplicant_required): False 28173 1726882761.00440: when evaluation is False, skipping this task 28173 1726882761.00448: _execute() done 28173 1726882761.00461: dumping result to json 28173 1726882761.00469: done dumping result, returning 28173 1726882761.00479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-926c-8928-000000000023] 28173 1726882761.00486: sending task result for task 0e448fcc-3ce9-926c-8928-000000000023 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28173 1726882761.00624: no more pending results, returning what we have 28173 1726882761.00627: results queue empty 28173 1726882761.00628: checking for any_errors_fatal 28173 1726882761.00653: done checking for any_errors_fatal 28173 1726882761.00653: checking for max_fail_percentage 28173 1726882761.00655: done checking for max_fail_percentage 28173 1726882761.00656: checking to see if all hosts have failed and the running result is not ok 28173 1726882761.00657: done checking to see if all 
hosts have failed 28173 1726882761.00657: getting the remaining hosts for this loop 28173 1726882761.00659: done getting the remaining hosts for this loop 28173 1726882761.00662: getting the next task for host managed_node2 28173 1726882761.00675: done getting next task for host managed_node2 28173 1726882761.00679: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882761.00682: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882761.00695: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000023 28173 1726882761.00699: WORKER PROCESS EXITING 28173 1726882761.00706: getting variables 28173 1726882761.00708: in VariableManager get_vars() 28173 1726882761.00749: Calling all_inventory to load vars for managed_node2 28173 1726882761.00751: Calling groups_inventory to load vars for managed_node2 28173 1726882761.00753: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882761.00765: Calling all_plugins_play to load vars for managed_node2 28173 1726882761.00771: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882761.00774: Calling groups_plugins_play to load vars for managed_node2 28173 1726882761.02427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882761.04231: done with get_vars() 28173 1726882761.04257: done getting variables 28173 1726882761.04324: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:21 -0400 (0:00:00.125) 0:00:14.207 ****** 28173 1726882761.04359: entering _queue_task() for managed_node2/service 28173 1726882761.04924: worker is 1 (out of 1 available) 28173 1726882761.04937: exiting _queue_task() for managed_node2/service 28173 1726882761.04948: done queuing things up, now waiting for results queue to drain 28173 1726882761.04950: waiting for pending results... 
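The skip recorded a few entries back for "Enable and start wpa_supplicant" comes from a conditionally gated service task: ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, but __network_wpa_supplicant_required evaluates False (the role consults __network_ieee802_1x_connections_defined and __network_wireless_connections_defined immediately beforehand, as seen above). A hedged sketch of a task with that shape, not the role's verbatim source:

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required

Because the final condition is False, the task is skipped without contacting the managed host, and the executor moves on to the "Enable network service" task queued above.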
28173 1726882761.05454: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882761.05582: in run() - task 0e448fcc-3ce9-926c-8928-000000000024 28173 1726882761.05598: variable 'ansible_search_path' from source: unknown 28173 1726882761.05601: variable 'ansible_search_path' from source: unknown 28173 1726882761.05642: calling self._execute() 28173 1726882761.05740: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882761.05744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882761.05755: variable 'omit' from source: magic vars 28173 1726882761.06175: variable 'ansible_distribution_major_version' from source: facts 28173 1726882761.06190: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882761.06316: variable 'network_provider' from source: set_fact 28173 1726882761.06319: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882761.06322: when evaluation is False, skipping this task 28173 1726882761.06324: _execute() done 28173 1726882761.06328: dumping result to json 28173 1726882761.06331: done dumping result, returning 28173 1726882761.06340: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-926c-8928-000000000024] 28173 1726882761.06347: sending task result for task 0e448fcc-3ce9-926c-8928-000000000024 28173 1726882761.06451: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000024 28173 1726882761.06454: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882761.06529: no more pending results, returning what we have 28173 1726882761.06533: results queue empty 28173 1726882761.06533: checking for any_errors_fatal 28173 1726882761.06544: done checking for any_errors_fatal 28173 1726882761.06545: checking for max_fail_percentage 28173 1726882761.06547: done checking for max_fail_percentage 28173 1726882761.06548: checking to see if all hosts have failed and the running result is not ok 28173 1726882761.06549: done checking to see if all hosts have failed 28173 1726882761.06550: getting the remaining hosts for this loop 28173 1726882761.06551: done getting the remaining hosts for this loop 28173 1726882761.06555: getting the next task for host managed_node2 28173 1726882761.06561: done getting next task for host managed_node2 28173 1726882761.06569: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882761.06573: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882761.06590: getting variables 28173 1726882761.06592: in VariableManager get_vars() 28173 1726882761.06634: Calling all_inventory to load vars for managed_node2 28173 1726882761.06637: Calling groups_inventory to load vars for managed_node2 28173 1726882761.06640: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882761.06652: Calling all_plugins_play to load vars for managed_node2 28173 1726882761.06655: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882761.06659: Calling groups_plugins_play to load vars for managed_node2 28173 1726882761.08943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882761.11861: done with get_vars() 28173 1726882761.11893: done getting variables 28173 1726882761.11955: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:21 -0400 (0:00:00.076) 0:00:14.284 ****** 28173 1726882761.11999: entering _queue_task() for managed_node2/copy 28173 1726882761.12311: worker is 1 (out of 1 available) 28173 1726882761.12322: exiting _queue_task() for managed_node2/copy 28173 1726882761.12334: done queuing things up, now waiting for results queue to drain 28173 1726882761.12335: waiting for pending results... 28173 1726882761.12619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882761.12739: in run() - task 0e448fcc-3ce9-926c-8928-000000000025 28173 1726882761.12754: variable 'ansible_search_path' from source: unknown 28173 1726882761.12757: variable 'ansible_search_path' from source: unknown 28173 1726882761.12797: calling self._execute() 28173 1726882761.12890: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882761.12894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882761.12904: variable 'omit' from source: magic vars 28173 1726882761.13389: variable 'ansible_distribution_major_version' from source: facts 28173 1726882761.13401: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882761.13551: variable 'network_provider' from source: set_fact 28173 1726882761.13554: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882761.13557: when evaluation is False, skipping this task 28173 1726882761.13559: _execute() done 28173 1726882761.13565: dumping result to json 28173 1726882761.13571: done dumping result, returning 28173 1726882761.13595: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-926c-8928-000000000025] 28173 1726882761.13601: sending task result for task 0e448fcc-3ce9-926c-8928-000000000025 28173 1726882761.13707: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000025 28173 1726882761.13710: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 28173 1726882761.13757: no more pending results, returning what we have 28173 1726882761.13761: results queue empty 28173 1726882761.13762: checking for any_errors_fatal 28173 1726882761.13774: done checking for any_errors_fatal 28173 1726882761.13775: checking for max_fail_percentage 28173 1726882761.13777: done checking for max_fail_percentage 28173 1726882761.13778: checking to see if all hosts have failed and the running result is not ok 28173 1726882761.13779: done checking to see if all hosts have failed 28173 1726882761.13780: getting the remaining hosts for this loop 28173 1726882761.13781: done getting the remaining hosts for this loop 28173 1726882761.13785: getting the next task for host managed_node2 28173 1726882761.13791: done getting next task for host managed_node2 28173 1726882761.13795: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882761.13798: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882761.13813: getting variables 28173 1726882761.13815: in VariableManager get_vars() 28173 1726882761.13875: Calling all_inventory to load vars for managed_node2 28173 1726882761.13878: Calling groups_inventory to load vars for managed_node2 28173 1726882761.13881: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882761.13893: Calling all_plugins_play to load vars for managed_node2 28173 1726882761.13896: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882761.13899: Calling groups_plugins_play to load vars for managed_node2 28173 1726882761.15611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882761.18284: done with get_vars() 28173 1726882761.18306: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:21 -0400 (0:00:00.063) 0:00:14.348 ****** 28173 1726882761.18397: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882761.18399: Creating lock for fedora.linux_system_roles.network_connections 28173 1726882761.18711: worker is 1 (out of 1 available) 28173 1726882761.18725: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882761.18738: done queuing things up, now waiting for results queue to drain 28173 1726882761.18740: waiting for pending results... 
28173 1726882761.19066: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882761.19197: in run() - task 0e448fcc-3ce9-926c-8928-000000000026 28173 1726882761.19211: variable 'ansible_search_path' from source: unknown 28173 1726882761.19215: variable 'ansible_search_path' from source: unknown 28173 1726882761.19247: calling self._execute() 28173 1726882761.19335: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882761.19340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882761.19351: variable 'omit' from source: magic vars 28173 1726882761.19779: variable 'ansible_distribution_major_version' from source: facts 28173 1726882761.19792: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882761.19799: variable 'omit' from source: magic vars 28173 1726882761.19887: variable 'omit' from source: magic vars 28173 1726882761.20121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882761.22576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882761.22638: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882761.22681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882761.22715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882761.22741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882761.22821: variable 'network_provider' from source: set_fact 28173 1726882761.22941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882761.22986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882761.23015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882761.23054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882761.23072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882761.23146: variable 'omit' from source: magic vars 28173 1726882761.23260: variable 'omit' from source: magic vars 28173 1726882761.23368: variable 'network_connections' from source: task vars 28173 1726882761.23380: variable 'interface' from source: set_fact 28173 1726882761.23447: variable 'interface' from source: set_fact 28173 1726882761.23454: variable 'interface' from source: set_fact 28173 1726882761.23517: variable 'interface' from source: set_fact 28173 1726882761.23732: variable 'omit' from source: magic vars 28173 1726882761.23740: 
variable '__lsr_ansible_managed' from source: task vars 28173 1726882761.23805: variable '__lsr_ansible_managed' from source: task vars 28173 1726882761.24075: Loaded config def from plugin (lookup/template) 28173 1726882761.24081: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28173 1726882761.24109: File lookup term: get_ansible_managed.j2 28173 1726882761.24112: variable 'ansible_search_path' from source: unknown 28173 1726882761.24115: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28173 1726882761.24127: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28173 1726882761.24141: variable 'ansible_search_path' from source: unknown 28173 1726882761.30592: variable 'ansible_managed' from source: unknown 28173 1726882761.30713: variable 'omit' from source: magic vars 28173 1726882761.30736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882761.30761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882761.30784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882761.30801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882761.30812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882761.30838: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882761.30842: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882761.30844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882761.30931: Set connection var ansible_pipelining to False 28173 1726882761.30934: Set connection var ansible_shell_type to sh 28173 1726882761.30942: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882761.30949: Set connection var ansible_timeout to 10 28173 1726882761.30955: Set connection var ansible_shell_executable to /bin/sh 28173 1726882761.30960: Set connection var ansible_connection to ssh 28173 1726882761.30987: variable 'ansible_shell_executable' from source: unknown 28173 1726882761.30990: variable 'ansible_connection' from source: unknown 28173 1726882761.30992: 
variable 'ansible_module_compression' from source: unknown 28173 1726882761.30995: variable 'ansible_shell_type' from source: unknown 28173 1726882761.30998: variable 'ansible_shell_executable' from source: unknown 28173 1726882761.31000: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882761.31002: variable 'ansible_pipelining' from source: unknown 28173 1726882761.31004: variable 'ansible_timeout' from source: unknown 28173 1726882761.31012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882761.31159: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882761.31174: variable 'omit' from source: magic vars 28173 1726882761.31181: starting attempt loop 28173 1726882761.31184: running the handler 28173 1726882761.31198: _low_level_execute_command(): starting 28173 1726882761.31203: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882761.31914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882761.31925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.31938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.31950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.31995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.32000: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882761.32010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.32027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882761.32030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882761.32035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882761.32044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.32053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.32065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.32076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.32083: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882761.32093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.32334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882761.32338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882761.32346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882761.32582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882761.34151: stdout chunk (state=3): >>>/root <<< 28173 1726882761.34252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882761.34334: stderr chunk 
(state=3): >>><<< 28173 1726882761.34337: stdout chunk (state=3): >>><<< 28173 1726882761.34465: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882761.34469: _low_level_execute_command(): starting 28173 1726882761.34472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996 `" && echo ansible-tmp-1726882761.3436365-28840-235306309796996="` echo /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996 `" ) && sleep 0' 28173 1726882761.35061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882761.35082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.35098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.35124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.35171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.35185: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882761.35199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.35258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882761.35275: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882761.35286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882761.35297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.35310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.35326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.35338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.35349: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882761.35374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.35446: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882761.35477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882761.35492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882761.35668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882761.37508: stdout chunk (state=3): >>>ansible-tmp-1726882761.3436365-28840-235306309796996=/root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996 <<< 28173 1726882761.37675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882761.37678: stdout chunk (state=3): >>><<< 28173 1726882761.37690: stderr chunk (state=3): >>><<< 28173 1726882761.37876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882761.3436365-28840-235306309796996=/root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882761.37883: variable 'ansible_module_compression' from source: unknown 28173 1726882761.37885: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 28173 1726882761.37888: ANSIBALLZ: Acquiring lock 28173 1726882761.37890: ANSIBALLZ: Lock acquired: 140243974027408 28173 1726882761.37892: ANSIBALLZ: Creating module 28173 1726882761.67932: ANSIBALLZ: Writing module into payload 28173 1726882761.68420: ANSIBALLZ: Writing module 28173 1726882761.68452: ANSIBALLZ: Renaming module 28173 1726882761.68465: ANSIBALLZ: Done creating module 28173 1726882761.68492: variable 'ansible_facts' from source: unknown 28173 1726882761.68599: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/AnsiballZ_network_connections.py 28173 1726882761.68761: Sending initial data 28173 1726882761.68766: Sent initial data (168 bytes) 28173 1726882761.69896: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882761.69909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.69922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.69937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.69979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 28173 1726882761.70079: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882761.70095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.70112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882761.70125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882761.70137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882761.70149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.70166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.70183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.70196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.70208: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882761.70222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.70294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882761.70483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882761.70498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882761.70645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882761.72476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882761.72580: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882761.72705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpgfgjwuvu /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/AnsiballZ_network_connections.py <<< 28173 1726882761.72786: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882761.74555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882761.74676: stderr chunk (state=3): >>><<< 28173 1726882761.74679: stdout chunk (state=3): >>><<< 28173 1726882761.74682: done transferring module to remote 28173 1726882761.74740: _low_level_execute_command(): starting 28173 1726882761.74743: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/ /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/AnsiballZ_network_connections.py && sleep 0' 28173 1726882761.75285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882761.75295: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.75301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.75314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.75349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.75356: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882761.75370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.75381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882761.75388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882761.75396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882761.75405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.75410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.75422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.75429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.75435: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882761.75444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.75532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882761.75554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882761.75570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882761.75689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882761.77456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882761.77501: stderr chunk (state=3): >>><<< 28173 1726882761.77504: stdout chunk (state=3): >>><<< 28173 1726882761.77543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882761.77550: _low_level_execute_command(): starting 28173 
1726882761.77553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/AnsiballZ_network_connections.py && sleep 0' 28173 1726882761.78144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882761.78158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.78173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.78187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.78222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.78229: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882761.78238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.78252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882761.78269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882761.78280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882761.78289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882761.78296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882761.78308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882761.78316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882761.78323: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882761.78332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882761.78419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882761.78423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882761.78433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882761.78570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.06237: stdout chunk (state=3): >>> <<< 28173 1726882762.06242: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": 
"nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28173 1726882762.08473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882762.08477: stdout chunk (state=3): >>><<< 28173 1726882762.08488: stderr chunk (state=3): >>><<< 28173 1726882762.08668: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882762.08673: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 30200, 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882762.08681: _low_level_execute_command(): starting 28173 1726882762.08684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882761.3436365-28840-235306309796996/ > /dev/null 2>&1 && sleep 0' 28173 1726882762.09983: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.09986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.10030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.10033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.10035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.10113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.10135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.10138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.10250: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 28173 1726882762.12070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.12139: stderr chunk (state=3): >>><<< 28173 1726882762.12372: stdout chunk (state=3): >>><<< 28173 1726882762.12376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.12378: handler run complete 28173 1726882762.12380: attempt loop complete, returning result 28173 1726882762.12382: _execute() done 28173 1726882762.12384: dumping result to json 28173 1726882762.12386: done dumping result, returning 28173 1726882762.12388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-926c-8928-000000000026] 28173 1726882762.12389: sending task result for task 0e448fcc-3ce9-926c-8928-000000000026 28173 1726882762.12474: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000026 28173 1726882762.12477: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active) 28173 1726882762.12631: no more pending results, returning what we have 28173 1726882762.12634: results queue empty 28173 1726882762.12635: checking for any_errors_fatal 28173 1726882762.12644: done checking for any_errors_fatal 28173 1726882762.12645: checking for max_fail_percentage 28173 1726882762.12646: done checking for max_fail_percentage 28173 1726882762.12647: checking to see if all 
hosts have failed and the running result is not ok 28173 1726882762.12648: done checking to see if all hosts have failed 28173 1726882762.12649: getting the remaining hosts for this loop 28173 1726882762.12650: done getting the remaining hosts for this loop 28173 1726882762.12654: getting the next task for host managed_node2 28173 1726882762.12661: done getting next task for host managed_node2 28173 1726882762.12666: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882762.12670: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882762.12683: getting variables 28173 1726882762.12685: in VariableManager get_vars() 28173 1726882762.12727: Calling all_inventory to load vars for managed_node2 28173 1726882762.12730: Calling groups_inventory to load vars for managed_node2 28173 1726882762.12732: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.12744: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.12748: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.12756: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.14528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.16196: done with get_vars() 28173 1726882762.16217: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:22 -0400 (0:00:00.979) 0:00:15.327 ****** 28173 1726882762.16302: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882762.16304: Creating lock for fedora.linux_system_roles.network_state 28173 1726882762.17174: worker is 1 (out of 1 available) 28173 1726882762.17184: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882762.17195: done queuing things up, now waiting for results queue to drain 28173 1726882762.17196: waiting for pending results... 
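The module_args echoed in the result above correspond one-to-one to the role's network_connections input. A sketch of a play that would produce this invocation, with the play and host wiring assumed and every connection value taken from the logged result:

    # Sketch: hosts/role wiring is illustrative; the connection values are
    # copied from the module result reported in this run.
    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: ethtest0
                interface_name: ethtest0
                type: ethernet
                state: up
                autoconnect: true
                ip:
                  dhcp4: false
                  address:
                    - 198.51.100.3/26
                  route:
                    - network: 198.51.100.128
                      prefix: 26
                      gateway: 198.51.100.1
                      metric: 2
                      table: 30400
                    - network: 198.51.100.64
                      prefix: 26
                      gateway: 198.51.100.6
                      metric: 4
                      table: 30200
                    - network: 192.0.2.64
                      prefix: 26
                      gateway: 198.51.100.8
                      metric: 50
                      table: 30200
                      src: 198.51.100.3

Routes carrying a table key are installed into the numbered policy-routing tables (30200, 30400) rather than the main table, which matches the route arguments the nm provider received above.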
28173 1726882762.17298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882762.17414: in run() - task 0e448fcc-3ce9-926c-8928-000000000027 28173 1726882762.17437: variable 'ansible_search_path' from source: unknown 28173 1726882762.17445: variable 'ansible_search_path' from source: unknown 28173 1726882762.17489: calling self._execute() 28173 1726882762.17585: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.17596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.17609: variable 'omit' from source: magic vars 28173 1726882762.17984: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.18001: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.18130: variable 'network_state' from source: role '' defaults 28173 1726882762.18143: Evaluated conditional (network_state != {}): False 28173 1726882762.18149: when evaluation is False, skipping this task 28173 1726882762.18155: _execute() done 28173 1726882762.18162: dumping result to json 28173 1726882762.18171: done dumping result, returning 28173 1726882762.18185: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-926c-8928-000000000027] 28173 1726882762.18194: sending task result for task 0e448fcc-3ce9-926c-8928-000000000027 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882762.18337: no more pending results, returning what we have 28173 1726882762.18341: results queue empty 28173 1726882762.18342: checking for any_errors_fatal 28173 1726882762.18358: done checking for any_errors_fatal 28173 1726882762.18358: checking for max_fail_percentage 28173 1726882762.18360: done checking for max_fail_percentage 28173 1726882762.18361: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.18362: done checking to see if all hosts have failed 28173 1726882762.18365: getting the remaining hosts for this loop 28173 1726882762.18366: done getting the remaining hosts for this loop 28173 1726882762.18370: getting the next task for host managed_node2 28173 1726882762.18377: done getting next task for host managed_node2 28173 1726882762.18381: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882762.18385: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882762.18400: getting variables 28173 1726882762.18402: in VariableManager get_vars() 28173 1726882762.18445: Calling all_inventory to load vars for managed_node2 28173 1726882762.18448: Calling groups_inventory to load vars for managed_node2 28173 1726882762.18450: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.18465: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.18469: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.18472: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.19562: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000027 28173 1726882762.19568: WORKER PROCESS EXITING 28173 1726882762.20189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.21986: done with get_vars() 28173 1726882762.22011: done getting variables 28173 1726882762.22077: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:22 -0400 (0:00:00.058) 0:00:15.385 ****** 28173 1726882762.22110: entering _queue_task() for managed_node2/debug 28173 1726882762.22372: worker is 1 (out of 1 available) 28173 1726882762.22386: exiting _queue_task() for managed_node2/debug 28173 1726882762.22398: done queuing things up, now waiting for results queue to drain 28173 1726882762.22399: waiting for pending results... 
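The preceding skip quotes its guard verbatim (network_state != {}); network_state came from the role defaults and stayed empty because this run only supplied network_connections. A sketch of the guarded task, assuming the module parameter name, which is never shown in this log:

    # The `when` expression is copied from the skip result above; the module
    # parameter name is an assumption, since the arguments do not appear in
    # this run.
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        desired_state: "{{ network_state }}"   # assumed parameter name
      when: network_state != {}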
28173 1726882762.22886: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882762.23020: in run() - task 0e448fcc-3ce9-926c-8928-000000000028 28173 1726882762.23039: variable 'ansible_search_path' from source: unknown 28173 1726882762.23046: variable 'ansible_search_path' from source: unknown 28173 1726882762.23090: calling self._execute() 28173 1726882762.23188: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.23200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.23215: variable 'omit' from source: magic vars 28173 1726882762.23968: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.23987: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.23999: variable 'omit' from source: magic vars 28173 1726882762.24112: variable 'omit' from source: magic vars 28173 1726882762.24197: variable 'omit' from source: magic vars 28173 1726882762.24310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882762.24346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882762.24400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882762.24462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.24502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.24567: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882762.24716: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.24724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.24878: Set connection var ansible_pipelining to False 28173 1726882762.24892: Set connection var ansible_shell_type to sh 28173 1726882762.24909: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882762.24923: Set connection var ansible_timeout to 10 28173 1726882762.24935: Set connection var ansible_shell_executable to /bin/sh 28173 1726882762.24944: Set connection var ansible_connection to ssh 28173 1726882762.24974: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.24984: variable 'ansible_connection' from source: unknown 28173 1726882762.25010: variable 'ansible_module_compression' from source: unknown 28173 1726882762.25017: variable 'ansible_shell_type' from source: unknown 28173 1726882762.25023: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.25029: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.25041: variable 'ansible_pipelining' from source: unknown 28173 1726882762.25047: variable 'ansible_timeout' from source: unknown 28173 1726882762.25054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.25209: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882762.25225: variable 'omit' from source: magic vars 28173 1726882762.25235: starting attempt loop 28173 1726882762.25242: running the handler 28173 1726882762.25377: variable '__network_connections_result' from source: set_fact 28173 1726882762.25437: handler run complete 28173 1726882762.25461: attempt loop complete, returning result 28173 1726882762.25473: _execute() done 28173 1726882762.25480: dumping result to json 28173 1726882762.25487: done dumping result, returning 28173 1726882762.25499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-926c-8928-000000000028] 28173 1726882762.25508: sending task result for task 0e448fcc-3ce9-926c-8928-000000000028 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active)" ] } 28173 1726882762.25655: no more pending results, returning what we have 28173 1726882762.25658: results queue empty 28173 1726882762.25659: checking for any_errors_fatal 28173 1726882762.25668: done checking for any_errors_fatal 28173 1726882762.25669: checking for max_fail_percentage 28173 1726882762.25670: done checking for max_fail_percentage 28173 1726882762.25671: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.25672: done checking to see if all hosts have failed 28173 1726882762.25673: getting the remaining hosts for this loop 28173 1726882762.25675: done getting the remaining hosts for this loop 28173 1726882762.25678: getting the next task for host managed_node2 28173 1726882762.25684: done getting next task for host managed_node2 28173 1726882762.25689: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882762.25692: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882762.25703: getting variables 28173 1726882762.25705: in VariableManager get_vars() 28173 1726882762.25744: Calling all_inventory to load vars for managed_node2 28173 1726882762.25747: Calling groups_inventory to load vars for managed_node2 28173 1726882762.25749: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.25759: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.25762: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.25767: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.26782: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000028 28173 1726882762.26785: WORKER PROCESS EXITING 28173 1726882762.28082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.29896: done with get_vars() 28173 1726882762.29918: done getting variables 28173 1726882762.29977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:22 -0400 (0:00:00.078) 0:00:15.464 ****** 28173 1726882762.30010: entering _queue_task() for managed_node2/debug 28173 1726882762.30266: worker is 1 (out of 1 available) 28173 1726882762.30279: exiting _queue_task() for managed_node2/debug 28173 1726882762.30292: done queuing things up, now waiting for results queue to drain 28173 1726882762.30294: waiting for pending results... 
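The two debug tasks in this stretch (main.yml:177 and main.yml:181) print __network_connections_result.stderr_lines and then the full __network_connections_result, which is what a plain debug action with var: produces. A sketch of both, assuming nothing beyond the variable names visible in the output:

    # Sketch: the var targets follow from the keys printed in the log; any
    # additional options on the real tasks are not visible here.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result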
28173 1726882762.30570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882762.30704: in run() - task 0e448fcc-3ce9-926c-8928-000000000029 28173 1726882762.30726: variable 'ansible_search_path' from source: unknown 28173 1726882762.30739: variable 'ansible_search_path' from source: unknown 28173 1726882762.30784: calling self._execute() 28173 1726882762.30878: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.30888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.30902: variable 'omit' from source: magic vars 28173 1726882762.31255: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.31277: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.31292: variable 'omit' from source: magic vars 28173 1726882762.31346: variable 'omit' from source: magic vars 28173 1726882762.31388: variable 'omit' from source: magic vars 28173 1726882762.31434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882762.31472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882762.31496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882762.31522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.31537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.31571: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882762.31580: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.31597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.31696: Set connection var ansible_pipelining to False 28173 1726882762.31700: Set connection var ansible_shell_type to sh 28173 1726882762.31705: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882762.31717: Set connection var ansible_timeout to 10 28173 1726882762.31720: Set connection var ansible_shell_executable to /bin/sh 28173 1726882762.31724: Set connection var ansible_connection to ssh 28173 1726882762.31756: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.31765: variable 'ansible_connection' from source: unknown 28173 1726882762.31772: variable 'ansible_module_compression' from source: unknown 28173 1726882762.31775: variable 'ansible_shell_type' from source: unknown 28173 1726882762.31777: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.31779: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.31784: variable 'ansible_pipelining' from source: unknown 28173 1726882762.31790: variable 'ansible_timeout' from source: unknown 28173 1726882762.31793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.31899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882762.31907: variable 'omit' from source: magic vars 28173 1726882762.31912: starting attempt loop 28173 1726882762.31915: running the handler 28173 1726882762.31962: variable '__network_connections_result' from source: set_fact 28173 1726882762.32017: variable '__network_connections_result' from source: set_fact 28173 1726882762.32142: handler run complete 28173 1726882762.32171: attempt loop complete, returning result 28173 1726882762.32175: _execute() done 28173 1726882762.32177: dumping result to json 28173 1726882762.32181: done dumping result, returning 28173 1726882762.32191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-926c-8928-000000000029] 28173 1726882762.32197: sending task result for task 0e448fcc-3ce9-926c-8928-000000000029 28173 1726882762.32293: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000029 28173 1726882762.32296: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (not-active)" ] } } 28173 1726882762.32398: no more pending results, returning what we have 28173 1726882762.32401: results queue empty 28173 1726882762.32402: checking for any_errors_fatal 28173 1726882762.32408: done checking for any_errors_fatal 28173 1726882762.32409: checking for max_fail_percentage 28173 1726882762.32411: done checking for max_fail_percentage 28173 1726882762.32411: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.32412: done checking to see if all hosts have failed 28173 1726882762.32413: getting the remaining hosts for this loop 28173 1726882762.32419: done getting the remaining hosts for this loop 28173 1726882762.32422: getting the next task for host managed_node2 28173 1726882762.32427: done getting next task for host managed_node2 28173 1726882762.32431: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882762.32433: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882762.32443: getting variables 28173 1726882762.32445: in VariableManager get_vars() 28173 1726882762.32479: Calling all_inventory to load vars for managed_node2 28173 1726882762.32481: Calling groups_inventory to load vars for managed_node2 28173 1726882762.32482: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.32489: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.32491: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.32492: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.33278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.34776: done with get_vars() 28173 1726882762.34800: done getting variables 28173 1726882762.34841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:22 -0400 (0:00:00.048) 0:00:15.513 ****** 28173 1726882762.34865: entering _queue_task() for managed_node2/debug 28173 1726882762.35056: worker is 1 (out of 1 available) 28173 1726882762.35070: exiting _queue_task() for managed_node2/debug 28173 1726882762.35083: done queuing things up, now waiting for results queue to drain 28173 1726882762.35084: waiting for pending results... 
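The next task header, Show debug messages for the network_state (tasks/main.yml:186), is evaluated immediately below and skipped because the conditional network_state != {} is false against the role's empty default. A minimal sketch of a task shape that would produce exactly that skip record follows; the condition string is copied from the false_condition in the log, while the module FQCN and the var parameter are assumptions.

  # Hedged sketch (not the actual role source): skipped whenever
  # network_state keeps its empty role default, matching the
  # false_condition recorded in the skip result below.
  - name: Show debug messages for the network_state
    ansible.builtin.debug:
      var: network_state
    when: network_state != {}
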
28173 1726882762.35253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882762.35341: in run() - task 0e448fcc-3ce9-926c-8928-00000000002a 28173 1726882762.35353: variable 'ansible_search_path' from source: unknown 28173 1726882762.35357: variable 'ansible_search_path' from source: unknown 28173 1726882762.35391: calling self._execute() 28173 1726882762.35457: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.35461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.35477: variable 'omit' from source: magic vars 28173 1726882762.35741: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.35750: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.35835: variable 'network_state' from source: role '' defaults 28173 1726882762.35844: Evaluated conditional (network_state != {}): False 28173 1726882762.35847: when evaluation is False, skipping this task 28173 1726882762.35851: _execute() done 28173 1726882762.35854: dumping result to json 28173 1726882762.35856: done dumping result, returning 28173 1726882762.35859: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-926c-8928-00000000002a] 28173 1726882762.35866: sending task result for task 0e448fcc-3ce9-926c-8928-00000000002a 28173 1726882762.35952: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000002a 28173 1726882762.35954: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28173 1726882762.36007: no more pending results, returning what we have 28173 1726882762.36010: results queue empty 28173 1726882762.36011: checking for any_errors_fatal 28173 1726882762.36018: done checking for any_errors_fatal 28173 1726882762.36019: checking for max_fail_percentage 28173 1726882762.36020: done checking for max_fail_percentage 28173 1726882762.36021: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.36022: done checking to see if all hosts have failed 28173 1726882762.36022: getting the remaining hosts for this loop 28173 1726882762.36023: done getting the remaining hosts for this loop 28173 1726882762.36026: getting the next task for host managed_node2 28173 1726882762.36031: done getting next task for host managed_node2 28173 1726882762.36034: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882762.36037: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882762.36050: getting variables 28173 1726882762.36051: in VariableManager get_vars() 28173 1726882762.36088: Calling all_inventory to load vars for managed_node2 28173 1726882762.36090: Calling groups_inventory to load vars for managed_node2 28173 1726882762.36092: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.36098: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.36100: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.36101: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.37160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.38790: done with get_vars() 28173 1726882762.38812: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:22 -0400 (0:00:00.040) 0:00:15.553 ****** 28173 1726882762.38885: entering _queue_task() for managed_node2/ping 28173 1726882762.38887: Creating lock for ping 28173 1726882762.39129: worker is 1 (out of 1 available) 28173 1726882762.39141: exiting _queue_task() for managed_node2/ping 28173 1726882762.39154: done queuing things up, now waiting for results queue to drain 28173 1726882762.39155: waiting for pending results... 28173 1726882762.39332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882762.39418: in run() - task 0e448fcc-3ce9-926c-8928-00000000002b 28173 1726882762.39430: variable 'ansible_search_path' from source: unknown 28173 1726882762.39433: variable 'ansible_search_path' from source: unknown 28173 1726882762.39461: calling self._execute() 28173 1726882762.39534: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.39538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.39548: variable 'omit' from source: magic vars 28173 1726882762.39815: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.39826: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.39833: variable 'omit' from source: magic vars 28173 1726882762.39871: variable 'omit' from source: magic vars 28173 1726882762.39899: variable 'omit' from source: magic vars 28173 1726882762.39935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882762.39960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882762.39981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882762.39994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.40004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.40028: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882762.40031: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.40034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.40107: Set connection var ansible_pipelining to False 28173 
1726882762.40111: Set connection var ansible_shell_type to sh 28173 1726882762.40116: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882762.40122: Set connection var ansible_timeout to 10 28173 1726882762.40127: Set connection var ansible_shell_executable to /bin/sh 28173 1726882762.40132: Set connection var ansible_connection to ssh 28173 1726882762.40151: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.40157: variable 'ansible_connection' from source: unknown 28173 1726882762.40160: variable 'ansible_module_compression' from source: unknown 28173 1726882762.40168: variable 'ansible_shell_type' from source: unknown 28173 1726882762.40171: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.40174: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.40176: variable 'ansible_pipelining' from source: unknown 28173 1726882762.40178: variable 'ansible_timeout' from source: unknown 28173 1726882762.40181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.40324: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882762.40333: variable 'omit' from source: magic vars 28173 1726882762.40336: starting attempt loop 28173 1726882762.40339: running the handler 28173 1726882762.40350: _low_level_execute_command(): starting 28173 1726882762.40357: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882762.41045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28173 1726882762.41068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.41091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.41219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.42883: stdout chunk (state=3): >>>/root <<< 28173 1726882762.42986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.43030: stderr chunk (state=3): >>><<< 28173 1726882762.43034: stdout chunk (state=3): >>><<< 28173 1726882762.43050: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.43062: _low_level_execute_command(): starting 28173 1726882762.43072: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290 `" && echo ansible-tmp-1726882762.4305165-28894-123442139387290="` echo /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290 `" ) && sleep 0' 28173 1726882762.43493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.43496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.43528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.43533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.43542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.43591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.43595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.43704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.45581: stdout chunk (state=3): >>>ansible-tmp-1726882762.4305165-28894-123442139387290=/root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290 <<< 28173 1726882762.45695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.45736: stderr chunk (state=3): >>><<< 28173 1726882762.45739: stdout chunk (state=3): >>><<< 28173 1726882762.45751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882762.4305165-28894-123442139387290=/root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.45786: variable 'ansible_module_compression' from source: unknown 28173 1726882762.45815: ANSIBALLZ: Using lock for ping 28173 1726882762.45818: ANSIBALLZ: Acquiring lock 28173 1726882762.45821: ANSIBALLZ: Lock acquired: 140243972859104 28173 1726882762.45828: ANSIBALLZ: Creating module 28173 1726882762.57431: ANSIBALLZ: Writing module into payload 28173 1726882762.57499: ANSIBALLZ: Writing module 28173 1726882762.57522: ANSIBALLZ: Renaming module 28173 1726882762.57526: ANSIBALLZ: Done creating module 28173 1726882762.57543: variable 'ansible_facts' from source: unknown 28173 1726882762.57612: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/AnsiballZ_ping.py 28173 1726882762.57749: Sending initial data 28173 1726882762.57752: Sent initial data (153 bytes) 28173 1726882762.58720: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882762.58726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.58739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.58750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.58790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.58799: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882762.58811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.58821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882762.58827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.58834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882762.58844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.58850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.58866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.58874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.58881: 
stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882762.58891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.58960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.58984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.58997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.59128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.60966: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882762.61054: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882762.61149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpe40z_v_g /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/AnsiballZ_ping.py <<< 28173 1726882762.61239: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882762.62469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.62554: stderr chunk (state=3): >>><<< 28173 1726882762.62558: stdout chunk (state=3): >>><<< 28173 1726882762.62582: done transferring module to remote 28173 1726882762.62592: _low_level_execute_command(): starting 28173 1726882762.62595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/ /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/AnsiballZ_ping.py && sleep 0' 28173 1726882762.63031: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.63035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.63063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.63072: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882762.63085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.63096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882762.63103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.63108: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.63117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.63126: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.63134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.63140: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.63195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.63216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.63223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.63320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.65143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.65205: stderr chunk (state=3): >>><<< 28173 1726882762.65209: stdout chunk (state=3): >>><<< 28173 1726882762.65302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.65306: _low_level_execute_command(): starting 28173 1726882762.65309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/AnsiballZ_ping.py && sleep 0' 28173 1726882762.66303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.66307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.66323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.66332: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882762.66335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.66387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882762.66393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.66405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882762.66413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.66419: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.66439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.66442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.66445: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882762.66451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.66595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.66614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.66630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.66753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.79856: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28173 1726882762.80817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882762.80878: stderr chunk (state=3): >>><<< 28173 1726882762.80882: stdout chunk (state=3): >>><<< 28173 1726882762.80897: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
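The pong reply above completes the Re-test connectivity step (tasks/main.yml:192): Ansible built AnsiballZ_ping.py locally, created a remote temporary directory, copied the payload over SFTP, made it executable, ran it with /usr/bin/python3.9, and, as the next entries show, removes the temporary directory again. A minimal sketch of a task that drives this flow is shown below; only the task name and the ping module are taken from the log, and the FQCN is an assumption.

  # Hedged sketch (not the actual role source): a bare ping task is enough
  # to trigger the AnsiballZ transfer-and-execute cycle seen in this log.
  - name: Re-test connectivity
    ansible.builtin.ping:
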
28173 1726882762.80916: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882762.80923: _low_level_execute_command(): starting 28173 1726882762.80928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882762.4305165-28894-123442139387290/ > /dev/null 2>&1 && sleep 0' 28173 1726882762.81386: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.81389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.81441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882762.81445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.81452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882762.81454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.81518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.81521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.81523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.81615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.83423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.83471: stderr chunk (state=3): >>><<< 28173 1726882762.83474: stdout chunk (state=3): >>><<< 28173 1726882762.83488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.83494: handler run complete 28173 1726882762.83505: attempt loop complete, returning result 28173 1726882762.83508: _execute() done 28173 1726882762.83510: dumping result to json 28173 1726882762.83516: done dumping result, returning 28173 1726882762.83526: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-926c-8928-00000000002b] 28173 1726882762.83531: sending task result for task 0e448fcc-3ce9-926c-8928-00000000002b 28173 1726882762.83618: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000002b 28173 1726882762.83621: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28173 1726882762.83678: no more pending results, returning what we have 28173 1726882762.83681: results queue empty 28173 1726882762.83682: checking for any_errors_fatal 28173 1726882762.83689: done checking for any_errors_fatal 28173 1726882762.83690: checking for max_fail_percentage 28173 1726882762.83691: done checking for max_fail_percentage 28173 1726882762.83692: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.83693: done checking to see if all hosts have failed 28173 1726882762.83694: getting the remaining hosts for this loop 28173 1726882762.83695: done getting the remaining hosts for this loop 28173 1726882762.83699: getting the next task for host managed_node2 28173 1726882762.83707: done getting next task for host managed_node2 28173 1726882762.83709: ^ task is: TASK: meta (role_complete) 28173 1726882762.83712: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882762.83721: getting variables 28173 1726882762.83723: in VariableManager get_vars() 28173 1726882762.83770: Calling all_inventory to load vars for managed_node2 28173 1726882762.83773: Calling groups_inventory to load vars for managed_node2 28173 1726882762.83775: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.83785: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.83788: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.83790: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.84654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.85687: done with get_vars() 28173 1726882762.85703: done getting variables 28173 1726882762.85760: done queuing things up, now waiting for results queue to drain 28173 1726882762.85762: results queue empty 28173 1726882762.85762: checking for any_errors_fatal 28173 1726882762.85767: done checking for any_errors_fatal 28173 1726882762.85768: checking for max_fail_percentage 28173 1726882762.85768: done checking for max_fail_percentage 28173 1726882762.85769: checking to see if all hosts have failed and the running result is not ok 28173 1726882762.85769: done checking to see if all hosts have failed 28173 1726882762.85770: getting the remaining hosts for this loop 28173 1726882762.85770: done getting the remaining hosts for this loop 28173 1726882762.85772: getting the next task for host managed_node2 28173 1726882762.85774: done getting next task for host managed_node2 28173 1726882762.85776: ^ task is: TASK: Get the routes from the route table 30200 28173 1726882762.85777: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882762.85778: getting variables 28173 1726882762.85779: in VariableManager get_vars() 28173 1726882762.85788: Calling all_inventory to load vars for managed_node2 28173 1726882762.85790: Calling groups_inventory to load vars for managed_node2 28173 1726882762.85791: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882762.85794: Calling all_plugins_play to load vars for managed_node2 28173 1726882762.85795: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882762.85797: Calling groups_plugins_play to load vars for managed_node2 28173 1726882762.86494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882762.87534: done with get_vars() 28173 1726882762.87571: done getting variables 28173 1726882762.87616: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30200] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:56 Friday 20 September 2024 21:39:22 -0400 (0:00:00.487) 0:00:16.040 ****** 28173 1726882762.87642: entering _queue_task() for managed_node2/command 28173 1726882762.87953: worker is 1 (out of 1 available) 28173 1726882762.87970: exiting _queue_task() for managed_node2/command 28173 1726882762.87982: done queuing things up, now waiting for results queue to drain 28173 1726882762.87987: waiting for pending results... 28173 1726882762.88380: running TaskExecutor() for managed_node2/TASK: Get the routes from the route table 30200 28173 1726882762.88486: in run() - task 0e448fcc-3ce9-926c-8928-00000000005b 28173 1726882762.88503: variable 'ansible_search_path' from source: unknown 28173 1726882762.88544: calling self._execute() 28173 1726882762.88626: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.88630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.88638: variable 'omit' from source: magic vars 28173 1726882762.88916: variable 'ansible_distribution_major_version' from source: facts 28173 1726882762.88932: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882762.88937: variable 'omit' from source: magic vars 28173 1726882762.88952: variable 'omit' from source: magic vars 28173 1726882762.88985: variable 'omit' from source: magic vars 28173 1726882762.89016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882762.89043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882762.89058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882762.89073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.89085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882762.89113: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882762.89116: variable 'ansible_host' from 
source: host vars for 'managed_node2' 28173 1726882762.89118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.89186: Set connection var ansible_pipelining to False 28173 1726882762.89189: Set connection var ansible_shell_type to sh 28173 1726882762.89196: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882762.89207: Set connection var ansible_timeout to 10 28173 1726882762.89212: Set connection var ansible_shell_executable to /bin/sh 28173 1726882762.89217: Set connection var ansible_connection to ssh 28173 1726882762.89233: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.89236: variable 'ansible_connection' from source: unknown 28173 1726882762.89240: variable 'ansible_module_compression' from source: unknown 28173 1726882762.89243: variable 'ansible_shell_type' from source: unknown 28173 1726882762.89245: variable 'ansible_shell_executable' from source: unknown 28173 1726882762.89247: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882762.89249: variable 'ansible_pipelining' from source: unknown 28173 1726882762.89251: variable 'ansible_timeout' from source: unknown 28173 1726882762.89255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882762.89357: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882762.89368: variable 'omit' from source: magic vars 28173 1726882762.89377: starting attempt loop 28173 1726882762.89380: running the handler 28173 1726882762.89392: _low_level_execute_command(): starting 28173 1726882762.89399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882762.89889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.89910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.89923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.89934: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.89983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.89994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.90103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.91745: stdout chunk (state=3): >>>/root <<< 28173 1726882762.91915: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 28173 1726882762.91925: stdout chunk (state=3): >>><<< 28173 1726882762.91936: stderr chunk (state=3): >>><<< 28173 1726882762.91957: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.91986: _low_level_execute_command(): starting 28173 1726882762.91995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928 `" && echo ansible-tmp-1726882762.919703-28920-245858065659928="` echo /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928 `" ) && sleep 0' 28173 1726882762.92636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882762.92649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.92662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.92685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.92726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.92742: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882762.92756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.92785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882762.92796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.92806: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882762.92817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.92831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.92853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.92869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.92883: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882762.92896: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.92983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.93002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.93017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.93149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.95081: stdout chunk (state=3): >>>ansible-tmp-1726882762.919703-28920-245858065659928=/root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928 <<< 28173 1726882762.95269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882762.95272: stdout chunk (state=3): >>><<< 28173 1726882762.95274: stderr chunk (state=3): >>><<< 28173 1726882762.95571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882762.919703-28920-245858065659928=/root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882762.95575: variable 'ansible_module_compression' from source: unknown 28173 1726882762.95578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882762.95580: variable 'ansible_facts' from source: unknown 28173 1726882762.95582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/AnsiballZ_command.py 28173 1726882762.95643: Sending initial data 28173 1726882762.95645: Sent initial data (155 bytes) 28173 1726882762.96688: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882762.96702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.96723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.96742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.96806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.96825: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882762.96840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.96858: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882762.96876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882762.96894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882762.96917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882762.96935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882762.96957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882762.96974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882762.97005: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882762.97021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882762.97138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882762.97159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882762.97178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882762.97318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882762.99071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882762.99162: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882762.99260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp_wcxrvnp /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/AnsiballZ_command.py <<< 28173 1726882762.99353: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882763.00630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.00772: stderr chunk (state=3): >>><<< 28173 1726882763.00775: stdout chunk (state=3): >>><<< 28173 1726882763.00851: done transferring module to remote 28173 1726882763.00855: _low_level_execute_command(): starting 28173 1726882763.00857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/ /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/AnsiballZ_command.py && sleep 0' 28173 1726882763.01471: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882763.01488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.01504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.01527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 
1726882763.01580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.01597: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.01612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.01635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.01652: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.01668: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882763.01683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.01698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.01715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.01729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.01744: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.01761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.01841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.01870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.01891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.02018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.03803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.03891: stderr chunk (state=3): >>><<< 28173 1726882763.03901: stdout chunk (state=3): >>><<< 28173 1726882763.04007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.04010: _low_level_execute_command(): starting 28173 1726882763.04013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/AnsiballZ_command.py && sleep 0' 28173 1726882763.04722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 
1726882763.04762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.04786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.04804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.04845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.04878: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.04900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.04917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.04929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.04940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882763.04952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.04968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.04985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.05011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.05024: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.05038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.05123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.05143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.05166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.05304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.18934: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-20 21:39:23.183469", "end": "2024-09-20 21:39:23.187155", "delta": "0:00:00.003686", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882763.20191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882763.20248: stderr chunk (state=3): >>><<< 28173 1726882763.20251: stdout chunk (state=3): >>><<< 28173 1726882763.20401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-20 21:39:23.183469", "end": "2024-09-20 21:39:23.187155", "delta": "0:00:00.003686", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
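The record above is the raw return of the ansible.legacy.command module for the task "Get the routes from the route table 30200"; the same pattern repeats for table 30400 (task path tests_route_table.yml:62 later in this log). A minimal sketch of what such tasks likely look like in the test playbook follows: the command lines and the variable names route_table_30200 and route_table_30400 are taken from this log, while the use of register and changed_when: false is an assumption, inferred from the final task result reporting "changed": false even though the module itself returned changed: true.

# Hedged reconstruction of the route-lookup tasks; the real definitions live
# in tests_route_table.yml and may differ in detail.
- name: Get the routes from the route table 30200
  command: ip route show table 30200
  register: route_table_30200
  changed_when: false   # assumption: the task result in this log reports changed: false

- name: Get the routes from the route table 30400
  command: ip route show table 30400
  register: route_table_30400
  changed_when: false   # same assumption as above
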
28173 1726882763.20411: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882763.20413: _low_level_execute_command(): starting 28173 1726882763.20416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882762.919703-28920-245858065659928/ > /dev/null 2>&1 && sleep 0' 28173 1726882763.21417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.21421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.21459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882763.21468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.21472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882763.21474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.21539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.21542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.21544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.21659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.23571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.23590: stdout chunk (state=3): >>><<< 28173 1726882763.23592: stderr chunk (state=3): >>><<< 28173 1726882763.23776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.23780: handler run complete 28173 1726882763.23782: Evaluated conditional (False): False 28173 1726882763.23785: attempt loop complete, returning result 28173 1726882763.23786: _execute() done 28173 1726882763.23788: dumping result to json 28173 1726882763.23790: done dumping result, returning 28173 1726882763.23792: done running TaskExecutor() for managed_node2/TASK: Get the routes from the route table 30200 [0e448fcc-3ce9-926c-8928-00000000005b] 28173 1726882763.23794: sending task result for task 0e448fcc-3ce9-926c-8928-00000000005b 28173 1726882763.23883: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000005b 28173 1726882763.23887: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30200" ], "delta": "0:00:00.003686", "end": "2024-09-20 21:39:23.187155", "rc": 0, "start": "2024-09-20 21:39:23.183469" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 28173 1726882763.23992: no more pending results, returning what we have 28173 1726882763.23997: results queue empty 28173 1726882763.23997: checking for any_errors_fatal 28173 1726882763.24000: done checking for any_errors_fatal 28173 1726882763.24001: checking for max_fail_percentage 28173 1726882763.24003: done checking for max_fail_percentage 28173 1726882763.24004: checking to see if all hosts have failed and the running result is not ok 28173 1726882763.24005: done checking to see if all hosts have failed 28173 1726882763.24006: getting the remaining hosts for this loop 28173 1726882763.24007: done getting the remaining hosts for this loop 28173 1726882763.24011: getting the next task for host managed_node2 28173 1726882763.24017: done getting next task for host managed_node2 28173 1726882763.24020: ^ task is: TASK: Get the routes from the route table 30400 28173 1726882763.24023: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882763.24028: getting variables 28173 1726882763.24029: in VariableManager get_vars() 28173 1726882763.24080: Calling all_inventory to load vars for managed_node2 28173 1726882763.24083: Calling groups_inventory to load vars for managed_node2 28173 1726882763.24086: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882763.24098: Calling all_plugins_play to load vars for managed_node2 28173 1726882763.24102: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882763.24105: Calling groups_plugins_play to load vars for managed_node2 28173 1726882763.26150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882763.32487: done with get_vars() 28173 1726882763.32510: done getting variables 28173 1726882763.32556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30400] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:62 Friday 20 September 2024 21:39:23 -0400 (0:00:00.449) 0:00:16.490 ****** 28173 1726882763.32583: entering _queue_task() for managed_node2/command 28173 1726882763.32911: worker is 1 (out of 1 available) 28173 1726882763.32928: exiting _queue_task() for managed_node2/command 28173 1726882763.32940: done queuing things up, now waiting for results queue to drain 28173 1726882763.32942: waiting for pending results... 28173 1726882763.33234: running TaskExecutor() for managed_node2/TASK: Get the routes from the route table 30400 28173 1726882763.33333: in run() - task 0e448fcc-3ce9-926c-8928-00000000005c 28173 1726882763.33352: variable 'ansible_search_path' from source: unknown 28173 1726882763.33407: calling self._execute() 28173 1726882763.33532: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.33544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.33559: variable 'omit' from source: magic vars 28173 1726882763.34052: variable 'ansible_distribution_major_version' from source: facts 28173 1726882763.34091: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882763.34104: variable 'omit' from source: magic vars 28173 1726882763.34137: variable 'omit' from source: magic vars 28173 1726882763.34186: variable 'omit' from source: magic vars 28173 1726882763.34239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882763.34292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882763.34316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882763.34339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.34370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.34405: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882763.34415: variable 'ansible_host' from 
source: host vars for 'managed_node2' 28173 1726882763.34424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.34551: Set connection var ansible_pipelining to False 28173 1726882763.34559: Set connection var ansible_shell_type to sh 28173 1726882763.34585: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882763.34598: Set connection var ansible_timeout to 10 28173 1726882763.34608: Set connection var ansible_shell_executable to /bin/sh 28173 1726882763.34618: Set connection var ansible_connection to ssh 28173 1726882763.34652: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.34660: variable 'ansible_connection' from source: unknown 28173 1726882763.34674: variable 'ansible_module_compression' from source: unknown 28173 1726882763.34689: variable 'ansible_shell_type' from source: unknown 28173 1726882763.34699: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.34705: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.34713: variable 'ansible_pipelining' from source: unknown 28173 1726882763.34718: variable 'ansible_timeout' from source: unknown 28173 1726882763.34725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.34878: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882763.34899: variable 'omit' from source: magic vars 28173 1726882763.34916: starting attempt loop 28173 1726882763.34923: running the handler 28173 1726882763.34942: _low_level_execute_command(): starting 28173 1726882763.34956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882763.35781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882763.35801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.35817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.35837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.35887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.35906: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.35922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.35942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.35955: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.35973: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882763.35988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.36002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.36022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.36033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 
1726882763.36043: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.36054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.36137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.36158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.36176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.36313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.37978: stdout chunk (state=3): >>>/root <<< 28173 1726882763.38084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.38172: stderr chunk (state=3): >>><<< 28173 1726882763.38184: stdout chunk (state=3): >>><<< 28173 1726882763.38281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.38284: _low_level_execute_command(): starting 28173 1726882763.38287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148 `" && echo ansible-tmp-1726882763.382164-28933-166610765191148="` echo /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148 `" ) && sleep 0' 28173 1726882763.39286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.39290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.39322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.39326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.39328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.39396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.39400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.39519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.41397: stdout chunk (state=3): >>>ansible-tmp-1726882763.382164-28933-166610765191148=/root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148 <<< 28173 1726882763.41591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.41595: stdout chunk (state=3): >>><<< 28173 1726882763.41597: stderr chunk (state=3): >>><<< 28173 1726882763.41677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882763.382164-28933-166610765191148=/root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.41680: variable 'ansible_module_compression' from source: unknown 28173 1726882763.41876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882763.41880: variable 'ansible_facts' from source: unknown 28173 1726882763.41882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/AnsiballZ_command.py 28173 1726882763.42030: Sending initial data 28173 1726882763.42033: Sent initial data (155 bytes) 28173 1726882763.43111: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882763.43126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.43142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.43169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.43217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.43231: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.43245: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.43260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.43278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.43293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882763.43303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.43314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.43327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.43336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.43345: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.43355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.43440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.43459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.43479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.43613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.45409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882763.45504: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882763.45609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpsmiu6_dw /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/AnsiballZ_command.py <<< 28173 1726882763.45698: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882763.47036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.47171: stderr chunk (state=3): >>><<< 28173 1726882763.47174: stdout chunk (state=3): >>><<< 28173 1726882763.47178: done transferring module to remote 28173 1726882763.47181: _low_level_execute_command(): starting 28173 1726882763.47183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/ /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/AnsiballZ_command.py && sleep 0' 28173 1726882763.47589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.47593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.47623: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.47630: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.47638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.47650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.47657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.47662: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.47678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.47688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.47694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.47699: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.47707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.47757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.47772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.47780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.47893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.49706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.49743: stderr chunk (state=3): >>><<< 28173 1726882763.49746: stdout chunk (state=3): >>><<< 28173 1726882763.49758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.49761: _low_level_execute_command(): starting 28173 1726882763.49768: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/AnsiballZ_command.py && sleep 0' 28173 1726882763.50159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
28173 1726882763.50167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.50199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.50203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.50221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.50224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.50284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.50288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.50290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.50400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.64015: stdout chunk (state=3): >>> {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-20 21:39:23.634778", "end": "2024-09-20 21:39:23.638254", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882763.65213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882763.65257: stderr chunk (state=3): >>><<< 28173 1726882763.65260: stdout chunk (state=3): >>><<< 28173 1726882763.65277: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-20 21:39:23.634778", "end": "2024-09-20 21:39:23.638254", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
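Only a few fields of the registered result matter for the assertions that follow. The sketch below shows the shape of the table 30400 result as reported in the record above, with values copied from the module return (the trailing space in stdout included), assuming the playbook stores the command result directly as route_table_30400.

# Shape of the registered command result used by the later assertion
# (values copied from the module return printed above; the surrounding
# structure is assumed).
route_table_30400:
  changed: false        # the task result reports false although the module returned true
  rc: 0
  cmd: ["ip", "route", "show", "table", "30400"]
  delta: "0:00:00.003476"
  stdout: "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 "
  stderr: ""
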
28173 1726882763.65310: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882763.65318: _low_level_execute_command(): starting 28173 1726882763.65321: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882763.382164-28933-166610765191148/ > /dev/null 2>&1 && sleep 0' 28173 1726882763.65735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.65740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.65773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882763.65779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.65788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.65796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.65801: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.65809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.65817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.65822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882763.65827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.65886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.65889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.65901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.66010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.67882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.67949: stderr chunk (state=3): >>><<< 28173 1726882763.67952: stdout chunk (state=3): >>><<< 28173 1726882763.68177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.68180: handler run complete 28173 1726882763.68183: Evaluated conditional (False): False 28173 1726882763.68185: attempt loop complete, returning result 28173 1726882763.68187: _execute() done 28173 1726882763.68189: dumping result to json 28173 1726882763.68191: done dumping result, returning 28173 1726882763.68193: done running TaskExecutor() for managed_node2/TASK: Get the routes from the route table 30400 [0e448fcc-3ce9-926c-8928-00000000005c] 28173 1726882763.68195: sending task result for task 0e448fcc-3ce9-926c-8928-00000000005c 28173 1726882763.68286: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000005c 28173 1726882763.68290: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30400" ], "delta": "0:00:00.003476", "end": "2024-09-20 21:39:23.638254", "rc": 0, "start": "2024-09-20 21:39:23.634778" } STDOUT: 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 28173 1726882763.68377: no more pending results, returning what we have 28173 1726882763.68380: results queue empty 28173 1726882763.68381: checking for any_errors_fatal 28173 1726882763.68393: done checking for any_errors_fatal 28173 1726882763.68393: checking for max_fail_percentage 28173 1726882763.68395: done checking for max_fail_percentage 28173 1726882763.68396: checking to see if all hosts have failed and the running result is not ok 28173 1726882763.68397: done checking to see if all hosts have failed 28173 1726882763.68398: getting the remaining hosts for this loop 28173 1726882763.68402: done getting the remaining hosts for this loop 28173 1726882763.68406: getting the next task for host managed_node2 28173 1726882763.68412: done getting next task for host managed_node2 28173 1726882763.68415: ^ task is: TASK: Assert that the route table 30200 contains the specified route 28173 1726882763.68417: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882763.68421: getting variables 28173 1726882763.68423: in VariableManager get_vars() 28173 1726882763.68474: Calling all_inventory to load vars for managed_node2 28173 1726882763.68477: Calling groups_inventory to load vars for managed_node2 28173 1726882763.68480: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882763.68492: Calling all_plugins_play to load vars for managed_node2 28173 1726882763.68495: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882763.68498: Calling groups_plugins_play to load vars for managed_node2 28173 1726882763.69539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882763.70484: done with get_vars() 28173 1726882763.70500: done getting variables 28173 1726882763.70542: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30200 contains the specified route] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:68 Friday 20 September 2024 21:39:23 -0400 (0:00:00.379) 0:00:16.870 ****** 28173 1726882763.70562: entering _queue_task() for managed_node2/assert 28173 1726882763.70762: worker is 1 (out of 1 available) 28173 1726882763.70778: exiting _queue_task() for managed_node2/assert 28173 1726882763.70790: done queuing things up, now waiting for results queue to drain 28173 1726882763.70791: waiting for pending results... 28173 1726882763.70963: running TaskExecutor() for managed_node2/TASK: Assert that the route table 30200 contains the specified route 28173 1726882763.71023: in run() - task 0e448fcc-3ce9-926c-8928-00000000005d 28173 1726882763.71035: variable 'ansible_search_path' from source: unknown 28173 1726882763.71067: calling self._execute() 28173 1726882763.71136: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.71140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.71150: variable 'omit' from source: magic vars 28173 1726882763.71417: variable 'ansible_distribution_major_version' from source: facts 28173 1726882763.71427: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882763.71433: variable 'omit' from source: magic vars 28173 1726882763.71449: variable 'omit' from source: magic vars 28173 1726882763.71483: variable 'omit' from source: magic vars 28173 1726882763.71514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882763.71543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882763.71561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882763.71579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.71591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.71615: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882763.71618: variable 
'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.71622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.71697: Set connection var ansible_pipelining to False 28173 1726882763.71700: Set connection var ansible_shell_type to sh 28173 1726882763.71705: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882763.71713: Set connection var ansible_timeout to 10 28173 1726882763.71718: Set connection var ansible_shell_executable to /bin/sh 28173 1726882763.71723: Set connection var ansible_connection to ssh 28173 1726882763.71740: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.71743: variable 'ansible_connection' from source: unknown 28173 1726882763.71746: variable 'ansible_module_compression' from source: unknown 28173 1726882763.71749: variable 'ansible_shell_type' from source: unknown 28173 1726882763.71751: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.71753: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.71756: variable 'ansible_pipelining' from source: unknown 28173 1726882763.71758: variable 'ansible_timeout' from source: unknown 28173 1726882763.71760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.71859: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882763.71870: variable 'omit' from source: magic vars 28173 1726882763.71875: starting attempt loop 28173 1726882763.71878: running the handler 28173 1726882763.71991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882763.72161: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882763.72194: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882763.72235: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882763.72260: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882763.72321: variable 'route_table_30200' from source: set_fact 28173 1726882763.72343: Evaluated conditional (route_table_30200.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 28173 1726882763.72434: variable 'route_table_30200' from source: set_fact 28173 1726882763.72454: Evaluated conditional (route_table_30200.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 28173 1726882763.72459: handler run complete 28173 1726882763.72472: attempt loop complete, returning result 28173 1726882763.72475: _execute() done 28173 1726882763.72478: dumping result to json 28173 1726882763.72480: done dumping result, returning 28173 1726882763.72486: done running TaskExecutor() for managed_node2/TASK: Assert that the route table 30200 contains the specified route [0e448fcc-3ce9-926c-8928-00000000005d] 28173 1726882763.72495: sending task result for task 0e448fcc-3ce9-926c-8928-00000000005d 28173 1726882763.72585: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000005d 28173 
1726882763.72588: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882763.72635: no more pending results, returning what we have 28173 1726882763.72638: results queue empty 28173 1726882763.72639: checking for any_errors_fatal 28173 1726882763.72644: done checking for any_errors_fatal 28173 1726882763.72645: checking for max_fail_percentage 28173 1726882763.72646: done checking for max_fail_percentage 28173 1726882763.72647: checking to see if all hosts have failed and the running result is not ok 28173 1726882763.72648: done checking to see if all hosts have failed 28173 1726882763.72649: getting the remaining hosts for this loop 28173 1726882763.72650: done getting the remaining hosts for this loop 28173 1726882763.72652: getting the next task for host managed_node2 28173 1726882763.72656: done getting next task for host managed_node2 28173 1726882763.72658: ^ task is: TASK: Assert that the route table 30400 contains the specified route 28173 1726882763.72660: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882763.72669: getting variables 28173 1726882763.72672: in VariableManager get_vars() 28173 1726882763.72706: Calling all_inventory to load vars for managed_node2 28173 1726882763.72712: Calling groups_inventory to load vars for managed_node2 28173 1726882763.72715: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882763.72723: Calling all_plugins_play to load vars for managed_node2 28173 1726882763.72725: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882763.72728: Calling groups_plugins_play to load vars for managed_node2 28173 1726882763.73620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882763.74553: done with get_vars() 28173 1726882763.74571: done getting variables 28173 1726882763.74611: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30400 contains the specified route] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:76 Friday 20 September 2024 21:39:23 -0400 (0:00:00.040) 0:00:16.910 ****** 28173 1726882763.74636: entering _queue_task() for managed_node2/assert 28173 1726882763.74829: worker is 1 (out of 1 available) 28173 1726882763.74841: exiting _queue_task() for managed_node2/assert 28173 1726882763.74854: done queuing things up, now waiting for results queue to drain 28173 1726882763.74855: waiting for pending results... 
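The two assertion tasks (task paths tests_route_table.yml:68 and tests_route_table.yml:76 in this log) run search tests against the registered stdout values. The conditionals in the sketch below are copied verbatim from the "Evaluated conditional" records, including the one for table 30400 that appears slightly later in this log; the surrounding task structure is an assumed reconstruction.

# Hedged reconstruction of the assertion tasks; the conditionals match the
# "Evaluated conditional" records in this log, the rest is assumed.
- name: Assert that the route table 30200 contains the specified route
  assert:
    that:
      - route_table_30200.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")
      - route_table_30200.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")

- name: Assert that the route table 30400 contains the specified route
  assert:
    that:
      - route_table_30400.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")
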
28173 1726882763.75026: running TaskExecutor() for managed_node2/TASK: Assert that the route table 30400 contains the specified route 28173 1726882763.75088: in run() - task 0e448fcc-3ce9-926c-8928-00000000005e 28173 1726882763.75099: variable 'ansible_search_path' from source: unknown 28173 1726882763.75127: calling self._execute() 28173 1726882763.75202: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.75206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.75215: variable 'omit' from source: magic vars 28173 1726882763.75481: variable 'ansible_distribution_major_version' from source: facts 28173 1726882763.75492: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882763.75498: variable 'omit' from source: magic vars 28173 1726882763.75515: variable 'omit' from source: magic vars 28173 1726882763.75542: variable 'omit' from source: magic vars 28173 1726882763.75574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882763.75601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882763.75617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882763.75630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.75639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.75662: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882763.75670: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.75673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.75736: Set connection var ansible_pipelining to False 28173 1726882763.75739: Set connection var ansible_shell_type to sh 28173 1726882763.75746: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882763.75752: Set connection var ansible_timeout to 10 28173 1726882763.75760: Set connection var ansible_shell_executable to /bin/sh 28173 1726882763.75766: Set connection var ansible_connection to ssh 28173 1726882763.75786: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.75788: variable 'ansible_connection' from source: unknown 28173 1726882763.75791: variable 'ansible_module_compression' from source: unknown 28173 1726882763.75794: variable 'ansible_shell_type' from source: unknown 28173 1726882763.75797: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.75799: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.75801: variable 'ansible_pipelining' from source: unknown 28173 1726882763.75803: variable 'ansible_timeout' from source: unknown 28173 1726882763.75807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.75911: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882763.75920: variable 'omit' from source: magic vars 28173 1726882763.75923: starting attempt loop 
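For reference, the connection variables being set here (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_timeout=10, ansible_pipelining=False) resolve from defaults ("source: unknown"), while ansible_host and ansible_ssh_extra_args come from host vars. A hypothetical inventory entry with that shape; the address is the one seen in the SSH debug output further below, and the extra-args value is not shown in this excerpt, so it is left elided:

    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.11.158     # address seen in the SSH debug output below
          ansible_ssh_extra_args: "..."  # set per host, exact value not shown here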
28173 1726882763.75926: running the handler 28173 1726882763.76037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882763.76323: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882763.76353: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882763.76411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882763.76438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882763.76496: variable 'route_table_30400' from source: set_fact 28173 1726882763.76522: Evaluated conditional (route_table_30400.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 28173 1726882763.76526: handler run complete 28173 1726882763.76537: attempt loop complete, returning result 28173 1726882763.76540: _execute() done 28173 1726882763.76543: dumping result to json 28173 1726882763.76545: done dumping result, returning 28173 1726882763.76551: done running TaskExecutor() for managed_node2/TASK: Assert that the route table 30400 contains the specified route [0e448fcc-3ce9-926c-8928-00000000005e] 28173 1726882763.76556: sending task result for task 0e448fcc-3ce9-926c-8928-00000000005e 28173 1726882763.76641: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000005e 28173 1726882763.76643: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882763.76691: no more pending results, returning what we have 28173 1726882763.76694: results queue empty 28173 1726882763.76695: checking for any_errors_fatal 28173 1726882763.76702: done checking for any_errors_fatal 28173 1726882763.76702: checking for max_fail_percentage 28173 1726882763.76704: done checking for max_fail_percentage 28173 1726882763.76705: checking to see if all hosts have failed and the running result is not ok 28173 1726882763.76706: done checking to see if all hosts have failed 28173 1726882763.76706: getting the remaining hosts for this loop 28173 1726882763.76708: done getting the remaining hosts for this loop 28173 1726882763.76711: getting the next task for host managed_node2 28173 1726882763.76715: done getting next task for host managed_node2 28173 1726882763.76721: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 28173 1726882763.76723: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882763.76726: getting variables 28173 1726882763.76727: in VariableManager get_vars() 28173 1726882763.76768: Calling all_inventory to load vars for managed_node2 28173 1726882763.76771: Calling groups_inventory to load vars for managed_node2 28173 1726882763.76773: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882763.76782: Calling all_plugins_play to load vars for managed_node2 28173 1726882763.76784: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882763.76785: Calling groups_plugins_play to load vars for managed_node2 28173 1726882763.77616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882763.79473: done with get_vars() 28173 1726882763.79498: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:82 Friday 20 September 2024 21:39:23 -0400 (0:00:00.049) 0:00:16.960 ****** 28173 1726882763.79560: entering _queue_task() for managed_node2/lineinfile 28173 1726882763.79561: Creating lock for lineinfile 28173 1726882763.79776: worker is 1 (out of 1 available) 28173 1726882763.79788: exiting _queue_task() for managed_node2/lineinfile 28173 1726882763.79801: done queuing things up, now waiting for results queue to drain 28173 1726882763.79803: waiting for pending results... 28173 1726882763.79982: running TaskExecutor() for managed_node2/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 28173 1726882763.80034: in run() - task 0e448fcc-3ce9-926c-8928-00000000005f 28173 1726882763.80046: variable 'ansible_search_path' from source: unknown 28173 1726882763.80079: calling self._execute() 28173 1726882763.80148: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.80153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.80161: variable 'omit' from source: magic vars 28173 1726882763.80422: variable 'ansible_distribution_major_version' from source: facts 28173 1726882763.80432: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882763.80438: variable 'omit' from source: magic vars 28173 1726882763.80454: variable 'omit' from source: magic vars 28173 1726882763.80484: variable 'omit' from source: magic vars 28173 1726882763.80516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882763.80541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882763.80555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882763.80572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.80583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882763.80605: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882763.80611: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.80614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.80687: Set connection var ansible_pipelining to 
False 28173 1726882763.80690: Set connection var ansible_shell_type to sh 28173 1726882763.80696: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882763.80703: Set connection var ansible_timeout to 10 28173 1726882763.80707: Set connection var ansible_shell_executable to /bin/sh 28173 1726882763.80716: Set connection var ansible_connection to ssh 28173 1726882763.80732: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.80735: variable 'ansible_connection' from source: unknown 28173 1726882763.80742: variable 'ansible_module_compression' from source: unknown 28173 1726882763.80745: variable 'ansible_shell_type' from source: unknown 28173 1726882763.80748: variable 'ansible_shell_executable' from source: unknown 28173 1726882763.80750: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882763.80752: variable 'ansible_pipelining' from source: unknown 28173 1726882763.80754: variable 'ansible_timeout' from source: unknown 28173 1726882763.80759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882763.80905: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882763.80914: variable 'omit' from source: magic vars 28173 1726882763.80919: starting attempt loop 28173 1726882763.80922: running the handler 28173 1726882763.80934: _low_level_execute_command(): starting 28173 1726882763.80940: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882763.81741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.81744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.81748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.81751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.81753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.81755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.81839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.83540: stdout chunk (state=3): >>>/root <<< 28173 1726882763.83646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.83720: stderr chunk (state=3): >>><<< 28173 1726882763.83729: stdout chunk (state=3): >>><<< 28173 1726882763.83760: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.83780: _low_level_execute_command(): starting 28173 1726882763.83790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757 `" && echo ansible-tmp-1726882763.8376744-28959-129435135608757="` echo /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757 `" ) && sleep 0' 28173 1726882763.84417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882763.84430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.84443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.84459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.84503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.84519: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882763.84538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.84554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882763.84567: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882763.84578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882763.84589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882763.84602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.84621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.84633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882763.84643: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882763.84655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.84737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.84757: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28173 1726882763.84774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.84914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882763.86836: stdout chunk (state=3): >>>ansible-tmp-1726882763.8376744-28959-129435135608757=/root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757 <<< 28173 1726882763.87018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882763.87021: stdout chunk (state=3): >>><<< 28173 1726882763.87024: stderr chunk (state=3): >>><<< 28173 1726882763.87072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882763.8376744-28959-129435135608757=/root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882763.87275: variable 'ansible_module_compression' from source: unknown 28173 1726882763.87278: ANSIBALLZ: Using lock for lineinfile 28173 1726882763.87280: ANSIBALLZ: Acquiring lock 28173 1726882763.87282: ANSIBALLZ: Lock acquired: 140243972760752 28173 1726882763.87284: ANSIBALLZ: Creating module 28173 1726882763.97344: ANSIBALLZ: Writing module into payload 28173 1726882763.97513: ANSIBALLZ: Writing module 28173 1726882763.97538: ANSIBALLZ: Renaming module 28173 1726882763.97548: ANSIBALLZ: Done creating module 28173 1726882763.97576: variable 'ansible_facts' from source: unknown 28173 1726882763.97661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/AnsiballZ_lineinfile.py 28173 1726882763.97831: Sending initial data 28173 1726882763.97834: Sent initial data (159 bytes) 28173 1726882763.98835: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.98841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882763.98893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882763.98896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28173 1726882763.98899: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882763.98901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882763.98961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882763.98988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882763.99003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882763.99137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.00975: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882764.01058: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882764.01156: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpi2uaw83u /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/AnsiballZ_lineinfile.py <<< 28173 1726882764.01252: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882764.02422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.02537: stderr chunk (state=3): >>><<< 28173 1726882764.02540: stdout chunk (state=3): >>><<< 28173 1726882764.02542: done transferring module to remote 28173 1726882764.02544: _low_level_execute_command(): starting 28173 1726882764.02546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/ /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/AnsiballZ_lineinfile.py && sleep 0' 28173 1726882764.03188: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.03191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.03220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.03226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.03229: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.03288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.03291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.03392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.05183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.05221: stderr chunk (state=3): >>><<< 28173 1726882764.05224: stdout chunk (state=3): >>><<< 28173 1726882764.05237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882764.05240: _low_level_execute_command(): starting 28173 1726882764.05245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/AnsiballZ_lineinfile.py && sleep 0' 28173 1726882764.05657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.05660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.05694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.05697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.05699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.05750: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.05753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.05862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.19640: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 28173 1726882764.20628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882764.20688: stderr chunk (state=3): >>><<< 28173 1726882764.20692: stdout chunk (state=3): >>><<< 28173 1726882764.20711: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
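The module_args echoed in the result above map one-to-one onto the lineinfile task that was queued. A minimal sketch of that task as it would appear at tests_route_table.yml:82, using only the parameter values from the logged invocation (anything else around it, such as privilege escalation, is not shown in this excerpt):

    - name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
      lineinfile:
        path: /etc/iproute2/rt_tables.d/table.conf
        line: "200 custom"   # adds routing table id 200 named "custom"
        mode: "0644"
        create: true         # the file does not exist yet, hence the empty "before" diff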
28173 1726882764.20738: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882764.20747: _low_level_execute_command(): starting 28173 1726882764.20751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882763.8376744-28959-129435135608757/ > /dev/null 2>&1 && sleep 0' 28173 1726882764.21206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.21218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.21237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882764.21253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.21301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.21313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.21420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.23299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.23341: stderr chunk (state=3): >>><<< 28173 1726882764.23344: stdout chunk (state=3): >>><<< 28173 1726882764.23356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882764.23361: handler run complete 28173 1726882764.23389: attempt loop complete, returning result 28173 1726882764.23392: _execute() done 28173 1726882764.23394: dumping result to json 28173 1726882764.23398: done dumping result, returning 28173 1726882764.23405: done running TaskExecutor() for managed_node2/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [0e448fcc-3ce9-926c-8928-00000000005f] 28173 1726882764.23410: sending task result for task 0e448fcc-3ce9-926c-8928-00000000005f 28173 1726882764.23508: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000005f 28173 1726882764.23511: WORKER PROCESS EXITING changed: [managed_node2] => { "backup": "", "changed": true } MSG: line added 28173 1726882764.23580: no more pending results, returning what we have 28173 1726882764.23582: results queue empty 28173 1726882764.23583: checking for any_errors_fatal 28173 1726882764.23589: done checking for any_errors_fatal 28173 1726882764.23590: checking for max_fail_percentage 28173 1726882764.23591: done checking for max_fail_percentage 28173 1726882764.23592: checking to see if all hosts have failed and the running result is not ok 28173 1726882764.23593: done checking to see if all hosts have failed 28173 1726882764.23594: getting the remaining hosts for this loop 28173 1726882764.23595: done getting the remaining hosts for this loop 28173 1726882764.23598: getting the next task for host managed_node2 28173 1726882764.23605: done getting next task for host managed_node2 28173 1726882764.23610: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882764.23613: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882764.23637: getting variables 28173 1726882764.23640: in VariableManager get_vars() 28173 1726882764.23684: Calling all_inventory to load vars for managed_node2 28173 1726882764.23686: Calling groups_inventory to load vars for managed_node2 28173 1726882764.23688: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.23698: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.23700: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.23703: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.24550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.25512: done with get_vars() 28173 1726882764.25527: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:24 -0400 (0:00:00.460) 0:00:17.420 ****** 28173 1726882764.25600: entering _queue_task() for managed_node2/include_tasks 28173 1726882764.25809: worker is 1 (out of 1 available) 28173 1726882764.25822: exiting _queue_task() for managed_node2/include_tasks 28173 1726882764.25834: done queuing things up, now waiting for results queue to drain 28173 1726882764.25835: waiting for pending results... 28173 1726882764.26017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882764.26122: in run() - task 0e448fcc-3ce9-926c-8928-000000000067 28173 1726882764.26139: variable 'ansible_search_path' from source: unknown 28173 1726882764.26142: variable 'ansible_search_path' from source: unknown 28173 1726882764.26171: calling self._execute() 28173 1726882764.26248: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.26252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.26260: variable 'omit' from source: magic vars 28173 1726882764.26534: variable 'ansible_distribution_major_version' from source: facts 28173 1726882764.26545: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882764.26550: _execute() done 28173 1726882764.26557: dumping result to json 28173 1726882764.26567: done dumping result, returning 28173 1726882764.26571: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-926c-8928-000000000067] 28173 1726882764.26579: sending task result for task 0e448fcc-3ce9-926c-8928-000000000067 28173 1726882764.26659: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000067 28173 1726882764.26662: WORKER PROCESS EXITING 28173 1726882764.26706: no more pending results, returning what we have 28173 1726882764.26711: in VariableManager get_vars() 28173 1726882764.26752: Calling all_inventory to load vars for managed_node2 28173 1726882764.26754: Calling groups_inventory to load vars for managed_node2 28173 1726882764.26756: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.26770: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.26773: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.26776: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.27693: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.28621: done with get_vars() 28173 1726882764.28634: variable 'ansible_search_path' from source: unknown 28173 1726882764.28635: variable 'ansible_search_path' from source: unknown 28173 1726882764.28660: we have included files to process 28173 1726882764.28661: generating all_blocks data 28173 1726882764.28665: done generating all_blocks data 28173 1726882764.28671: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882764.28673: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882764.28674: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882764.29046: done processing included file 28173 1726882764.29047: iterating over new_blocks loaded from include file 28173 1726882764.29048: in VariableManager get_vars() 28173 1726882764.29067: done with get_vars() 28173 1726882764.29068: filtering new block on tags 28173 1726882764.29081: done filtering new block on tags 28173 1726882764.29082: in VariableManager get_vars() 28173 1726882764.29096: done with get_vars() 28173 1726882764.29097: filtering new block on tags 28173 1726882764.29108: done filtering new block on tags 28173 1726882764.29110: in VariableManager get_vars() 28173 1726882764.29123: done with get_vars() 28173 1726882764.29124: filtering new block on tags 28173 1726882764.29133: done filtering new block on tags 28173 1726882764.29134: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28173 1726882764.29138: extending task lists for all hosts with included blocks 28173 1726882764.29619: done extending task lists 28173 1726882764.29620: done processing included files 28173 1726882764.29620: results queue empty 28173 1726882764.29621: checking for any_errors_fatal 28173 1726882764.29624: done checking for any_errors_fatal 28173 1726882764.29624: checking for max_fail_percentage 28173 1726882764.29625: done checking for max_fail_percentage 28173 1726882764.29625: checking to see if all hosts have failed and the running result is not ok 28173 1726882764.29626: done checking to see if all hosts have failed 28173 1726882764.29626: getting the remaining hosts for this loop 28173 1726882764.29627: done getting the remaining hosts for this loop 28173 1726882764.29629: getting the next task for host managed_node2 28173 1726882764.29631: done getting next task for host managed_node2 28173 1726882764.29633: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882764.29635: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882764.29641: getting variables 28173 1726882764.29642: in VariableManager get_vars() 28173 1726882764.29651: Calling all_inventory to load vars for managed_node2 28173 1726882764.29653: Calling groups_inventory to load vars for managed_node2 28173 1726882764.29654: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.29657: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.29659: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.29661: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.30346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.31263: done with get_vars() 28173 1726882764.31280: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:24 -0400 (0:00:00.057) 0:00:17.477 ****** 28173 1726882764.31328: entering _queue_task() for managed_node2/setup 28173 1726882764.31516: worker is 1 (out of 1 available) 28173 1726882764.31527: exiting _queue_task() for managed_node2/setup 28173 1726882764.31539: done queuing things up, now waiting for results queue to drain 28173 1726882764.31540: waiting for pending results... 28173 1726882764.31710: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882764.31816: in run() - task 0e448fcc-3ce9-926c-8928-0000000005df 28173 1726882764.31827: variable 'ansible_search_path' from source: unknown 28173 1726882764.31831: variable 'ansible_search_path' from source: unknown 28173 1726882764.31857: calling self._execute() 28173 1726882764.31930: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.31933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.31941: variable 'omit' from source: magic vars 28173 1726882764.32209: variable 'ansible_distribution_major_version' from source: facts 28173 1726882764.32218: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882764.32362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882764.33948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882764.34003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882764.34032: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882764.34059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882764.34084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882764.34138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
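Stepping back, the tasks now being executed were pulled in by the include at roles/network/tasks/main.yml:4 described above. A rough sketch of that include, where only the task name and the included file name are taken from the log and the rest of the shape is assumed:

    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml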
28173 1726882764.34161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882764.34184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882764.34211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882764.34222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882764.34256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882764.34283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882764.34301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882764.34326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882764.34336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882764.34445: variable '__network_required_facts' from source: role '' defaults 28173 1726882764.34452: variable 'ansible_facts' from source: unknown 28173 1726882764.34901: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28173 1726882764.34905: when evaluation is False, skipping this task 28173 1726882764.34907: _execute() done 28173 1726882764.34910: dumping result to json 28173 1726882764.34912: done dumping result, returning 28173 1726882764.34914: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-926c-8928-0000000005df] 28173 1726882764.34920: sending task result for task 0e448fcc-3ce9-926c-8928-0000000005df 28173 1726882764.35007: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000005df 28173 1726882764.35010: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882764.35071: no more pending results, returning what we have 28173 1726882764.35076: results queue empty 28173 1726882764.35076: checking for any_errors_fatal 28173 1726882764.35078: done checking for any_errors_fatal 28173 1726882764.35078: checking for max_fail_percentage 28173 1726882764.35080: done checking for max_fail_percentage 28173 1726882764.35081: checking to see if all hosts have failed and the running 
result is not ok 28173 1726882764.35082: done checking to see if all hosts have failed 28173 1726882764.35082: getting the remaining hosts for this loop 28173 1726882764.35084: done getting the remaining hosts for this loop 28173 1726882764.35087: getting the next task for host managed_node2 28173 1726882764.35094: done getting next task for host managed_node2 28173 1726882764.35098: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882764.35102: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882764.35119: getting variables 28173 1726882764.35120: in VariableManager get_vars() 28173 1726882764.35164: Calling all_inventory to load vars for managed_node2 28173 1726882764.35167: Calling groups_inventory to load vars for managed_node2 28173 1726882764.35169: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.35179: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.35182: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.35184: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.36039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.36992: done with get_vars() 28173 1726882764.37007: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:24 -0400 (0:00:00.057) 0:00:17.535 ****** 28173 1726882764.37081: entering _queue_task() for managed_node2/stat 28173 1726882764.37278: worker is 1 (out of 1 available) 28173 1726882764.37289: exiting _queue_task() for managed_node2/stat 28173 1726882764.37301: done queuing things up, now waiting for results queue to drain 28173 1726882764.37303: waiting for pending results... 
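The fact-gathering task above was skipped because its guard evaluated to False, meaning every fact listed in __network_required_facts is already present in ansible_facts. A sketch of that guard pattern from set_facts.yml:3; the when expression, the setup action, and the no_log censoring are from the log, while the gather_subset value is an assumption:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumed subset; the actual value is not shown in this excerpt
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true           # matches the "output has been hidden" message above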
28173 1726882764.37486: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882764.37582: in run() - task 0e448fcc-3ce9-926c-8928-0000000005e1 28173 1726882764.37592: variable 'ansible_search_path' from source: unknown 28173 1726882764.37595: variable 'ansible_search_path' from source: unknown 28173 1726882764.37626: calling self._execute() 28173 1726882764.37694: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.37698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.37706: variable 'omit' from source: magic vars 28173 1726882764.37975: variable 'ansible_distribution_major_version' from source: facts 28173 1726882764.37985: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882764.38102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882764.38290: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882764.38322: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882764.38345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882764.38372: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882764.38433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882764.38450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882764.38469: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882764.38488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882764.38549: variable '__network_is_ostree' from source: set_fact 28173 1726882764.38555: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882764.38558: when evaluation is False, skipping this task 28173 1726882764.38561: _execute() done 28173 1726882764.38563: dumping result to json 28173 1726882764.38567: done dumping result, returning 28173 1726882764.38575: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-926c-8928-0000000005e1] 28173 1726882764.38581: sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e1 28173 1726882764.38659: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e1 28173 1726882764.38662: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882764.38715: no more pending results, returning what we have 28173 1726882764.38718: results queue empty 28173 1726882764.38718: checking for any_errors_fatal 28173 1726882764.38724: done checking for any_errors_fatal 28173 1726882764.38725: checking for 
max_fail_percentage 28173 1726882764.38726: done checking for max_fail_percentage 28173 1726882764.38727: checking to see if all hosts have failed and the running result is not ok 28173 1726882764.38728: done checking to see if all hosts have failed 28173 1726882764.38729: getting the remaining hosts for this loop 28173 1726882764.38730: done getting the remaining hosts for this loop 28173 1726882764.38734: getting the next task for host managed_node2 28173 1726882764.38739: done getting next task for host managed_node2 28173 1726882764.38742: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882764.38745: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882764.38760: getting variables 28173 1726882764.38761: in VariableManager get_vars() 28173 1726882764.38802: Calling all_inventory to load vars for managed_node2 28173 1726882764.38805: Calling groups_inventory to load vars for managed_node2 28173 1726882764.38807: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.38815: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.38818: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.38820: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.39666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.40600: done with get_vars() 28173 1726882764.40615: done getting variables 28173 1726882764.40654: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:24 -0400 (0:00:00.036) 0:00:17.571 ****** 28173 1726882764.40682: entering _queue_task() for managed_node2/set_fact 28173 1726882764.40866: worker is 1 (out of 1 available) 28173 1726882764.40878: exiting _queue_task() for managed_node2/set_fact 28173 1726882764.40891: done queuing things up, now waiting for results queue to drain 28173 1726882764.40892: waiting for pending results... 
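Both ostree tasks above are skipped because their condition, "not __network_is_ostree is defined", evaluates to False once the fact has already been set earlier in the run. Ansible evaluates such conditions through Jinja2; the snippet below is a minimal standalone sketch of the same evaluation (it assumes the jinja2 package is installed and is not Ansible's internal code path).

# Evaluate the skip condition as a plain Jinja2 expression.
from jinja2 import Environment

env = Environment()
cond = env.compile_expression("not __network_is_ostree is defined",
                              undefined_to_none=False)

print(cond(__network_is_ostree=False))  # False -> condition fails, task is skipped
print(cond())                           # True  -> fact missing, task would run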
28173 1726882764.41061: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882764.41167: in run() - task 0e448fcc-3ce9-926c-8928-0000000005e2 28173 1726882764.41187: variable 'ansible_search_path' from source: unknown 28173 1726882764.41191: variable 'ansible_search_path' from source: unknown 28173 1726882764.41216: calling self._execute() 28173 1726882764.41288: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.41295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.41304: variable 'omit' from source: magic vars 28173 1726882764.41565: variable 'ansible_distribution_major_version' from source: facts 28173 1726882764.41578: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882764.41691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882764.41876: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882764.41907: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882764.41932: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882764.41956: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882764.42018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882764.42039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882764.42058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882764.42080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882764.42139: variable '__network_is_ostree' from source: set_fact 28173 1726882764.42148: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882764.42151: when evaluation is False, skipping this task 28173 1726882764.42157: _execute() done 28173 1726882764.42160: dumping result to json 28173 1726882764.42162: done dumping result, returning 28173 1726882764.42174: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-926c-8928-0000000005e2] 28173 1726882764.42179: sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e2 28173 1726882764.42255: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e2 28173 1726882764.42259: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882764.42308: no more pending results, returning what we have 28173 1726882764.42312: results queue empty 28173 1726882764.42312: checking for any_errors_fatal 28173 1726882764.42318: done checking for any_errors_fatal 28173 
1726882764.42319: checking for max_fail_percentage 28173 1726882764.42320: done checking for max_fail_percentage 28173 1726882764.42321: checking to see if all hosts have failed and the running result is not ok 28173 1726882764.42322: done checking to see if all hosts have failed 28173 1726882764.42322: getting the remaining hosts for this loop 28173 1726882764.42324: done getting the remaining hosts for this loop 28173 1726882764.42327: getting the next task for host managed_node2 28173 1726882764.42335: done getting next task for host managed_node2 28173 1726882764.42338: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882764.42342: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882764.42355: getting variables 28173 1726882764.42357: in VariableManager get_vars() 28173 1726882764.42398: Calling all_inventory to load vars for managed_node2 28173 1726882764.42401: Calling groups_inventory to load vars for managed_node2 28173 1726882764.42403: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882764.42411: Calling all_plugins_play to load vars for managed_node2 28173 1726882764.42413: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882764.42415: Calling groups_plugins_play to load vars for managed_node2 28173 1726882764.43186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882764.44125: done with get_vars() 28173 1726882764.44139: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:24 -0400 (0:00:00.035) 0:00:17.606 ****** 28173 1726882764.44204: entering _queue_task() for managed_node2/service_facts 28173 1726882764.44391: worker is 1 (out of 1 available) 28173 1726882764.44404: exiting _queue_task() for managed_node2/service_facts 28173 1726882764.44417: done queuing things up, now waiting for results queue to drain 28173 1726882764.44419: waiting for pending results... 
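The "Check which services are running" task queued above is what produces the large services dictionary visible further down in this log. As a rough standalone approximation of that output shape on a systemd host, one could parse systemctl directly; this is not the service_facts module's actual implementation, and the systemctl invocation and field mapping below are assumptions for illustration.

# Rough approximation of service_facts-style output on a systemd host.
import subprocess

def list_systemd_services() -> dict:
    out = subprocess.run(
        ["systemctl", "list-units", "--type=service", "--all",
         "--no-legend", "--plain", "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout
    services = {}
    for line in out.splitlines():
        # columns: UNIT LOAD ACTIVE SUB DESCRIPTION...
        parts = line.split(None, 4)
        if len(parts) < 4:
            continue
        unit, _load, _active, sub = parts[:4]
        services[unit] = {
            "name": unit,
            # the real module reports finer-grained state/status; this is a coarse mapping
            "state": "running" if sub == "running" else "stopped",
            "source": "systemd",
        }
    return services

if __name__ == "__main__":
    for name, info in sorted(list_systemd_services().items())[:5]:
        print(name, info["state"])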
28173 1726882764.44587: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882764.44682: in run() - task 0e448fcc-3ce9-926c-8928-0000000005e4 28173 1726882764.44694: variable 'ansible_search_path' from source: unknown 28173 1726882764.44697: variable 'ansible_search_path' from source: unknown 28173 1726882764.44724: calling self._execute() 28173 1726882764.44794: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.44798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.44806: variable 'omit' from source: magic vars 28173 1726882764.45056: variable 'ansible_distribution_major_version' from source: facts 28173 1726882764.45067: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882764.45077: variable 'omit' from source: magic vars 28173 1726882764.45127: variable 'omit' from source: magic vars 28173 1726882764.45149: variable 'omit' from source: magic vars 28173 1726882764.45184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882764.45212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882764.45226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882764.45239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882764.45248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882764.45274: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882764.45277: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.45280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.45347: Set connection var ansible_pipelining to False 28173 1726882764.45350: Set connection var ansible_shell_type to sh 28173 1726882764.45356: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882764.45363: Set connection var ansible_timeout to 10 28173 1726882764.45371: Set connection var ansible_shell_executable to /bin/sh 28173 1726882764.45376: Set connection var ansible_connection to ssh 28173 1726882764.45392: variable 'ansible_shell_executable' from source: unknown 28173 1726882764.45395: variable 'ansible_connection' from source: unknown 28173 1726882764.45398: variable 'ansible_module_compression' from source: unknown 28173 1726882764.45400: variable 'ansible_shell_type' from source: unknown 28173 1726882764.45402: variable 'ansible_shell_executable' from source: unknown 28173 1726882764.45406: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882764.45408: variable 'ansible_pipelining' from source: unknown 28173 1726882764.45410: variable 'ansible_timeout' from source: unknown 28173 1726882764.45414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882764.45552: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882764.45560: variable 'omit' from source: magic vars 28173 
1726882764.45566: starting attempt loop 28173 1726882764.45571: running the handler 28173 1726882764.45582: _low_level_execute_command(): starting 28173 1726882764.45588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882764.46099: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.46115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.46129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882764.46141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.46151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.46199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.46211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.46333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.48007: stdout chunk (state=3): >>>/root <<< 28173 1726882764.48103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.48154: stderr chunk (state=3): >>><<< 28173 1726882764.48157: stdout chunk (state=3): >>><<< 28173 1726882764.48179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882764.48189: _low_level_execute_command(): starting 28173 1726882764.48194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912 `" 
&& echo ansible-tmp-1726882764.481779-28978-201897466055912="` echo /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912 `" ) && sleep 0' 28173 1726882764.48617: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.48630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.48648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882764.48673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.48714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.48726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.48833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.50730: stdout chunk (state=3): >>>ansible-tmp-1726882764.481779-28978-201897466055912=/root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912 <<< 28173 1726882764.50843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.50913: stderr chunk (state=3): >>><<< 28173 1726882764.50920: stdout chunk (state=3): >>><<< 28173 1726882764.50938: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882764.481779-28978-201897466055912=/root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882764.50994: variable 'ansible_module_compression' from source: unknown 28173 1726882764.51043: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28173 1726882764.51097: variable 'ansible_facts' from source: unknown 28173 1726882764.51189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/AnsiballZ_service_facts.py 28173 1726882764.51336: Sending initial data 28173 1726882764.51339: Sent initial data (161 bytes) 28173 1726882764.52376: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882764.52390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.52403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.52420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.52474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.52488: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882764.52502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.52520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882764.52535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882764.52551: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882764.52569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.52592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.52609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.52621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.52631: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882764.52643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.52734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.52750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882764.52772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.52903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.54706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 28173 1726882764.54710: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882764.54798: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882764.54896: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-2817385jvvq10/tmpo_uuemj_ /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/AnsiballZ_service_facts.py <<< 28173 1726882764.54991: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882764.56122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.56289: stderr chunk (state=3): >>><<< 28173 1726882764.56292: stdout chunk (state=3): >>><<< 28173 1726882764.56294: done transferring module to remote 28173 1726882764.56296: _low_level_execute_command(): starting 28173 1726882764.56299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/ /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/AnsiballZ_service_facts.py && sleep 0' 28173 1726882764.56868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882764.56882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.56895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.56912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.56954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.56975: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882764.56989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.57006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882764.57024: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882764.57036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882764.57048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.57060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.57085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.57100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.57110: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882764.57122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.57206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.57225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882764.57239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.57366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882764.59174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882764.59244: stderr chunk (state=3): >>><<< 28173 1726882764.59254: stdout chunk (state=3): >>><<< 28173 1726882764.59348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882764.59351: _low_level_execute_command(): starting 28173 1726882764.59353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/AnsiballZ_service_facts.py && sleep 0' 28173 1726882764.59917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882764.59929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.59942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.59957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.60002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.60018: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882764.60031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.60047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882764.60057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882764.60070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882764.60082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882764.60098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882764.60113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882764.60129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882764.60140: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882764.60152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882764.60234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882764.60257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882764.60276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882764.60408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882765.95114: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 28173 1726882765.95142: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28173 1726882765.96438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882765.96442: stdout chunk (state=3): >>><<< 28173 1726882765.96449: stderr chunk (state=3): >>><<< 28173 1726882765.96476: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882765.97732: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882765.97856: _low_level_execute_command(): starting 28173 1726882765.97860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882764.481779-28978-201897466055912/ > /dev/null 2>&1 && sleep 0' 28173 1726882765.99241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882765.99244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882765.99357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882765.99361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882765.99363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882765.99434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882765.99547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882765.99760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.01553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.01668: stderr chunk (state=3): >>><<< 28173 1726882766.01671: stdout chunk (state=3): >>><<< 28173 1726882766.01971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882766.01974: handler run complete 28173 1726882766.01976: variable 'ansible_facts' from source: unknown 28173 1726882766.02039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882766.03494: variable 'ansible_facts' from source: unknown 28173 1726882766.03728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882766.04108: attempt loop complete, returning result 28173 1726882766.04229: _execute() done 28173 1726882766.04236: dumping result to json 28173 1726882766.04299: done dumping result, returning 28173 1726882766.04312: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-926c-8928-0000000005e4] 28173 1726882766.04322: sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e4 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882766.05798: no more pending results, returning what we have 28173 1726882766.05801: results queue empty 28173 1726882766.05802: checking for any_errors_fatal 28173 1726882766.05807: done checking for any_errors_fatal 28173 1726882766.05808: checking for max_fail_percentage 28173 1726882766.05810: done checking for max_fail_percentage 28173 1726882766.05811: checking to see if all hosts have failed and the running result is not ok 28173 1726882766.05812: done checking to see if all hosts have failed 28173 1726882766.05813: getting the remaining hosts for this loop 28173 1726882766.05814: done getting the remaining hosts for this loop 28173 1726882766.05818: getting the next task for host managed_node2 28173 1726882766.05825: done getting next task for host managed_node2 28173 1726882766.05828: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882766.05833: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882766.05844: getting variables 28173 1726882766.05846: in VariableManager get_vars() 28173 1726882766.05884: Calling all_inventory to load vars for managed_node2 28173 1726882766.05887: Calling groups_inventory to load vars for managed_node2 28173 1726882766.05890: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882766.05901: Calling all_plugins_play to load vars for managed_node2 28173 1726882766.05904: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882766.05908: Calling groups_plugins_play to load vars for managed_node2 28173 1726882766.07481: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e4 28173 1726882766.07485: WORKER PROCESS EXITING 28173 1726882766.08055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882766.11961: done with get_vars() 28173 1726882766.11987: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:26 -0400 (0:00:01.678) 0:00:19.285 ****** 28173 1726882766.12086: entering _queue_task() for managed_node2/package_facts 28173 1726882766.13198: worker is 1 (out of 1 available) 28173 1726882766.13212: exiting _queue_task() for managed_node2/package_facts 28173 1726882766.13225: done queuing things up, now waiting for results queue to drain 28173 1726882766.13226: waiting for pending results... 28173 1726882766.14153: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882766.14462: in run() - task 0e448fcc-3ce9-926c-8928-0000000005e5 28173 1726882766.14610: variable 'ansible_search_path' from source: unknown 28173 1726882766.14620: variable 'ansible_search_path' from source: unknown 28173 1726882766.14667: calling self._execute() 28173 1726882766.14906: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882766.14923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882766.14994: variable 'omit' from source: magic vars 28173 1726882766.15717: variable 'ansible_distribution_major_version' from source: facts 28173 1726882766.15805: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882766.15816: variable 'omit' from source: magic vars 28173 1726882766.15896: variable 'omit' from source: magic vars 28173 1726882766.16048: variable 'omit' from source: magic vars 28173 1726882766.16095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882766.16268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882766.16295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882766.16318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882766.16339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882766.16375: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882766.16452: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882766.16460: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882766.16684: Set connection var ansible_pipelining to False 28173 1726882766.16692: Set connection var ansible_shell_type to sh 28173 1726882766.16705: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882766.16718: Set connection var ansible_timeout to 10 28173 1726882766.16781: Set connection var ansible_shell_executable to /bin/sh 28173 1726882766.16792: Set connection var ansible_connection to ssh 28173 1726882766.16817: variable 'ansible_shell_executable' from source: unknown 28173 1726882766.16828: variable 'ansible_connection' from source: unknown 28173 1726882766.16886: variable 'ansible_module_compression' from source: unknown 28173 1726882766.16892: variable 'ansible_shell_type' from source: unknown 28173 1726882766.16897: variable 'ansible_shell_executable' from source: unknown 28173 1726882766.16902: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882766.16907: variable 'ansible_pipelining' from source: unknown 28173 1726882766.16912: variable 'ansible_timeout' from source: unknown 28173 1726882766.16918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882766.17272: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882766.17380: variable 'omit' from source: magic vars 28173 1726882766.17389: starting attempt loop 28173 1726882766.17396: running the handler 28173 1726882766.17412: _low_level_execute_command(): starting 28173 1726882766.17478: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882766.19401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.19405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.19430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882766.19433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.19553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.19557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.19688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.19762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882766.19772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.19891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.21585: stdout chunk (state=3): >>>/root <<< 28173 
1726882766.21686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.21755: stderr chunk (state=3): >>><<< 28173 1726882766.21758: stdout chunk (state=3): >>><<< 28173 1726882766.21883: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882766.21887: _low_level_execute_command(): starting 28173 1726882766.21890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203 `" && echo ansible-tmp-1726882766.2178338-29034-21965005863203="` echo /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203 `" ) && sleep 0' 28173 1726882766.23213: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.23217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.23251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.23262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.23267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.23440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.23444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.23561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.25499: stdout chunk (state=3): 
>>>ansible-tmp-1726882766.2178338-29034-21965005863203=/root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203 <<< 28173 1726882766.25612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.25675: stderr chunk (state=3): >>><<< 28173 1726882766.25679: stdout chunk (state=3): >>><<< 28173 1726882766.25973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882766.2178338-29034-21965005863203=/root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882766.25978: variable 'ansible_module_compression' from source: unknown 28173 1726882766.25980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28173 1726882766.25982: variable 'ansible_facts' from source: unknown 28173 1726882766.26074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/AnsiballZ_package_facts.py 28173 1726882766.26521: Sending initial data 28173 1726882766.26524: Sent initial data (161 bytes) 28173 1726882766.28420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882766.28444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.28460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.28484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.28526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.28554: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882766.28585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.28611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882766.28625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882766.28637: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882766.28655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.28676: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28173 1726882766.28693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.28714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.28732: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882766.28746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.28829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.28847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882766.28861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.29004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.30784: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882766.30880: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882766.30983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpu4c7oc8e /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/AnsiballZ_package_facts.py <<< 28173 1726882766.31079: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882766.34278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.34546: stderr chunk (state=3): >>><<< 28173 1726882766.34549: stdout chunk (state=3): >>><<< 28173 1726882766.34551: done transferring module to remote 28173 1726882766.34553: _low_level_execute_command(): starting 28173 1726882766.34556: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/ /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/AnsiballZ_package_facts.py && sleep 0' 28173 1726882766.35156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882766.35176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.35190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.35206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.35251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.35262: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882766.35280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.35296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 
1726882766.35311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882766.35327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882766.35338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.35351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.35368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.35381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.35390: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882766.35402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.35488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.35508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882766.35522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.35649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.37507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.37511: stdout chunk (state=3): >>><<< 28173 1726882766.37513: stderr chunk (state=3): >>><<< 28173 1726882766.37626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882766.37629: _low_level_execute_command(): starting 28173 1726882766.37632: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/AnsiballZ_package_facts.py && sleep 0' 28173 1726882766.38251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882766.38269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.38290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.38316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.38357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 28173 1726882766.38375: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882766.38390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.38419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882766.38432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882766.38444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882766.38456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.38476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.38493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.38510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.38527: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882766.38543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.38672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.38695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882766.38712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.38851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.85079: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": 
"17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 28173 1726882766.85088: stdout chunk (state=3): >>>}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", 
"version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x8<<< 28173 1726882766.85106: stdout chunk (state=3): >>>6_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [<<< 28173 1726882766.85111: stdout chunk (state=3): >>>{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": 
[{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 28173 1726882766.85117: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "py<<< 28173 1726882766.85193: stdout chunk (state=3): >>>thon3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": 
"nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source":<<< 28173 1726882766.85207: stdout chunk (state=3): >>> "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rp<<< 28173 1726882766.85213: stdout chunk (state=3): >>>m"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1"<<< 28173 1726882766.85217: stdout chunk (state=3): >>>, "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "pe<<< 28173 1726882766.85240: stdout chunk (state=3): >>>rl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": 
"perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch"<<< 28173 1726882766.85268: stdout chunk (state=3): >>>: null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 28173 1726882766.85272: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", 
"version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28173 1726882766.86800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882766.86803: stdout chunk (state=3): >>><<< 28173 1726882766.86806: stderr chunk (state=3): >>><<< 28173 1726882766.87133: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": 
"centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": 
"gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": 
[{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": 
"0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", 
"version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882766.90857: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882766.90888: _low_level_execute_command(): starting 28173 1726882766.90901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882766.2178338-29034-21965005863203/ > /dev/null 2>&1 && sleep 0' 28173 1726882766.91831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882766.91846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.91861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.91884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.91929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.91943: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882766.91957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.91977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882766.91988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882766.91998: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882766.92008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882766.92021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882766.92037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882766.92050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882766.92061: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882766.92079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882766.92152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882766.92178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882766.92195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882766.92324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882766.94240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882766.94243: stdout chunk (state=3): >>><<< 28173 1726882766.94251: stderr chunk (state=3): >>><<< 28173 1726882766.94571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882766.94574: handler run complete 28173 1726882766.95348: variable 'ansible_facts' from source: unknown 28173 1726882766.96479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882766.98642: variable 'ansible_facts' from source: unknown 28173 1726882766.99125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882766.99950: attempt loop complete, returning result 28173 1726882766.99969: _execute() done 28173 1726882766.99976: dumping result to json 28173 1726882767.00216: done dumping result, returning 28173 1726882767.00228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-926c-8928-0000000005e5] 28173 1726882767.00236: sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e5 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882767.02760: no more pending results, returning what we have 28173 1726882767.02763: results queue empty 28173 1726882767.02765: checking for any_errors_fatal 28173 1726882767.02772: done checking for any_errors_fatal 28173 1726882767.02773: checking for max_fail_percentage 28173 1726882767.02774: done checking for max_fail_percentage 28173 1726882767.02775: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.02776: done checking to see if all hosts have failed 28173 1726882767.02777: getting the remaining hosts for this loop 28173 1726882767.02779: done getting the remaining hosts for this loop 28173 1726882767.02782: getting the next task for host managed_node2 28173 1726882767.02788: done getting next task for host managed_node2 28173 1726882767.02792: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882767.02795: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882767.02805: getting variables 28173 1726882767.02807: in VariableManager get_vars() 28173 1726882767.02840: Calling all_inventory to load vars for managed_node2 28173 1726882767.02842: Calling groups_inventory to load vars for managed_node2 28173 1726882767.02844: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.02853: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.02856: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.02858: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.03805: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000005e5 28173 1726882767.03809: WORKER PROCESS EXITING 28173 1726882767.04171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.05921: done with get_vars() 28173 1726882767.05948: done getting variables 28173 1726882767.06007: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:27 -0400 (0:00:00.939) 0:00:20.224 ****** 28173 1726882767.06041: entering _queue_task() for managed_node2/debug 28173 1726882767.06327: worker is 1 (out of 1 available) 28173 1726882767.06339: exiting _queue_task() for managed_node2/debug 28173 1726882767.06352: done queuing things up, now waiting for results queue to drain 28173 1726882767.06353: waiting for pending results... 
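The ok: [managed_node2] result for the package-facts task above is censored because the task ran with no_log enabled ('_ansible_no_log': True in the recorded module args). A minimal sketch of requesting the same behaviour on a task, assuming a plain task file rather than the role's exact layout:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
      no_log: true     # yields the "censored" result seen above instead of dumping the full package map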
28173 1726882767.06670: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882767.06762: in run() - task 0e448fcc-3ce9-926c-8928-000000000068 28173 1726882767.06777: variable 'ansible_search_path' from source: unknown 28173 1726882767.06781: variable 'ansible_search_path' from source: unknown 28173 1726882767.06810: calling self._execute() 28173 1726882767.06891: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.06895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.06901: variable 'omit' from source: magic vars 28173 1726882767.07186: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.07196: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.07203: variable 'omit' from source: magic vars 28173 1726882767.07244: variable 'omit' from source: magic vars 28173 1726882767.07316: variable 'network_provider' from source: set_fact 28173 1726882767.07329: variable 'omit' from source: magic vars 28173 1726882767.07362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882767.07393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882767.07408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882767.07421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882767.07431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882767.07455: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882767.07458: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.07461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.07530: Set connection var ansible_pipelining to False 28173 1726882767.07533: Set connection var ansible_shell_type to sh 28173 1726882767.07540: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882767.07546: Set connection var ansible_timeout to 10 28173 1726882767.07551: Set connection var ansible_shell_executable to /bin/sh 28173 1726882767.07561: Set connection var ansible_connection to ssh 28173 1726882767.07581: variable 'ansible_shell_executable' from source: unknown 28173 1726882767.07585: variable 'ansible_connection' from source: unknown 28173 1726882767.07588: variable 'ansible_module_compression' from source: unknown 28173 1726882767.07590: variable 'ansible_shell_type' from source: unknown 28173 1726882767.07592: variable 'ansible_shell_executable' from source: unknown 28173 1726882767.07597: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.07602: variable 'ansible_pipelining' from source: unknown 28173 1726882767.07604: variable 'ansible_timeout' from source: unknown 28173 1726882767.07608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.07713: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 28173 1726882767.07721: variable 'omit' from source: magic vars 28173 1726882767.07725: starting attempt loop 28173 1726882767.07728: running the handler 28173 1726882767.07765: handler run complete 28173 1726882767.07779: attempt loop complete, returning result 28173 1726882767.07783: _execute() done 28173 1726882767.07785: dumping result to json 28173 1726882767.07787: done dumping result, returning 28173 1726882767.07791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-926c-8928-000000000068] 28173 1726882767.07798: sending task result for task 0e448fcc-3ce9-926c-8928-000000000068 28173 1726882767.07883: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000068 28173 1726882767.07886: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28173 1726882767.07946: no more pending results, returning what we have 28173 1726882767.07949: results queue empty 28173 1726882767.07950: checking for any_errors_fatal 28173 1726882767.07960: done checking for any_errors_fatal 28173 1726882767.07960: checking for max_fail_percentage 28173 1726882767.07961: done checking for max_fail_percentage 28173 1726882767.07962: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.07970: done checking to see if all hosts have failed 28173 1726882767.07971: getting the remaining hosts for this loop 28173 1726882767.07973: done getting the remaining hosts for this loop 28173 1726882767.07976: getting the next task for host managed_node2 28173 1726882767.07981: done getting next task for host managed_node2 28173 1726882767.07986: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882767.07988: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.07998: getting variables 28173 1726882767.07999: in VariableManager get_vars() 28173 1726882767.08032: Calling all_inventory to load vars for managed_node2 28173 1726882767.08034: Calling groups_inventory to load vars for managed_node2 28173 1726882767.08036: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.08044: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.08046: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.08049: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.09191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.10354: done with get_vars() 28173 1726882767.10375: done getting variables 28173 1726882767.10420: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:27 -0400 (0:00:00.044) 0:00:20.268 ****** 28173 1726882767.10446: entering _queue_task() for managed_node2/fail 28173 1726882767.10667: worker is 1 (out of 1 available) 28173 1726882767.10684: exiting _queue_task() for managed_node2/fail 28173 1726882767.10696: done queuing things up, now waiting for results queue to drain 28173 1726882767.10697: waiting for pending results... 
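The "Print network provider" task above runs the debug action and reports "Using network provider: nm", with network_provider coming from an earlier set_fact. A sketch of a task with the same effect, assuming the shipped role's exact wording may differ:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"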
28173 1726882767.10890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882767.10981: in run() - task 0e448fcc-3ce9-926c-8928-000000000069 28173 1726882767.10996: variable 'ansible_search_path' from source: unknown 28173 1726882767.11000: variable 'ansible_search_path' from source: unknown 28173 1726882767.11029: calling self._execute() 28173 1726882767.11104: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.11108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.11117: variable 'omit' from source: magic vars 28173 1726882767.11404: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.11414: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.11504: variable 'network_state' from source: role '' defaults 28173 1726882767.11512: Evaluated conditional (network_state != {}): False 28173 1726882767.11515: when evaluation is False, skipping this task 28173 1726882767.11518: _execute() done 28173 1726882767.11520: dumping result to json 28173 1726882767.11523: done dumping result, returning 28173 1726882767.11529: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-926c-8928-000000000069] 28173 1726882767.11535: sending task result for task 0e448fcc-3ce9-926c-8928-000000000069 28173 1726882767.11654: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000069 28173 1726882767.11657: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882767.11890: no more pending results, returning what we have 28173 1726882767.11893: results queue empty 28173 1726882767.11894: checking for any_errors_fatal 28173 1726882767.11899: done checking for any_errors_fatal 28173 1726882767.11900: checking for max_fail_percentage 28173 1726882767.11902: done checking for max_fail_percentage 28173 1726882767.11903: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.11904: done checking to see if all hosts have failed 28173 1726882767.11905: getting the remaining hosts for this loop 28173 1726882767.11906: done getting the remaining hosts for this loop 28173 1726882767.11909: getting the next task for host managed_node2 28173 1726882767.11915: done getting next task for host managed_node2 28173 1726882767.11919: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882767.11922: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.11937: getting variables 28173 1726882767.11939: in VariableManager get_vars() 28173 1726882767.11984: Calling all_inventory to load vars for managed_node2 28173 1726882767.11987: Calling groups_inventory to load vars for managed_node2 28173 1726882767.11989: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.11997: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.12000: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.12002: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.13530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.14533: done with get_vars() 28173 1726882767.14551: done getting variables 28173 1726882767.14593: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:27 -0400 (0:00:00.041) 0:00:20.310 ****** 28173 1726882767.14616: entering _queue_task() for managed_node2/fail 28173 1726882767.14811: worker is 1 (out of 1 available) 28173 1726882767.14825: exiting _queue_task() for managed_node2/fail 28173 1726882767.14837: done queuing things up, now waiting for results queue to drain 28173 1726882767.14839: waiting for pending results... 
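The "Abort applying the network state configuration ..." task above is skipped because its guard network_state != {} evaluates to False (network_state here comes from the role defaults and is empty). A minimal sketch of that pattern; the failure message is assumed, and the shipped role may combine the guard with a provider check implied by the task name:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration with the initscripts provider is not supported   # wording assumed, not taken from this log
      when: network_state != {}   # the condition recorded as false_condition above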
28173 1726882767.15022: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882767.15124: in run() - task 0e448fcc-3ce9-926c-8928-00000000006a 28173 1726882767.15136: variable 'ansible_search_path' from source: unknown 28173 1726882767.15139: variable 'ansible_search_path' from source: unknown 28173 1726882767.15171: calling self._execute() 28173 1726882767.15245: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.15249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.15257: variable 'omit' from source: magic vars 28173 1726882767.15570: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.15579: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.15684: variable 'network_state' from source: role '' defaults 28173 1726882767.15695: Evaluated conditional (network_state != {}): False 28173 1726882767.15699: when evaluation is False, skipping this task 28173 1726882767.15701: _execute() done 28173 1726882767.15705: dumping result to json 28173 1726882767.15708: done dumping result, returning 28173 1726882767.15730: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-926c-8928-00000000006a] 28173 1726882767.15744: sending task result for task 0e448fcc-3ce9-926c-8928-00000000006a 28173 1726882767.16029: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006a 28173 1726882767.16033: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882767.16082: no more pending results, returning what we have 28173 1726882767.16085: results queue empty 28173 1726882767.16086: checking for any_errors_fatal 28173 1726882767.16092: done checking for any_errors_fatal 28173 1726882767.16092: checking for max_fail_percentage 28173 1726882767.16094: done checking for max_fail_percentage 28173 1726882767.16095: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.16096: done checking to see if all hosts have failed 28173 1726882767.16097: getting the remaining hosts for this loop 28173 1726882767.16098: done getting the remaining hosts for this loop 28173 1726882767.16101: getting the next task for host managed_node2 28173 1726882767.16106: done getting next task for host managed_node2 28173 1726882767.16112: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882767.16115: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.16131: getting variables 28173 1726882767.16133: in VariableManager get_vars() 28173 1726882767.16182: Calling all_inventory to load vars for managed_node2 28173 1726882767.16185: Calling groups_inventory to load vars for managed_node2 28173 1726882767.16187: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.16196: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.16199: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.16202: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.17508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.18753: done with get_vars() 28173 1726882767.18772: done getting variables 28173 1726882767.18812: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:27 -0400 (0:00:00.042) 0:00:20.352 ****** 28173 1726882767.18835: entering _queue_task() for managed_node2/fail 28173 1726882767.19016: worker is 1 (out of 1 available) 28173 1726882767.19030: exiting _queue_task() for managed_node2/fail 28173 1726882767.19042: done queuing things up, now waiting for results queue to drain 28173 1726882767.19043: waiting for pending results... 
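Every task in this stretch first evaluates ansible_distribution_major_version != '6' before its own guard, which is consistent with a distribution gate shared across the role's task list (it could equally be attached to each task individually). A hypothetical sketch of that structure, reusing the debug task shown earlier as the body:

    - name: Gate the network role tasks on the distribution version (structure assumed)
      when: ansible_distribution_major_version != '6'
      block:
        - name: Print network provider
          ansible.builtin.debug:
            msg: "Using network provider: {{ network_provider }}"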
28173 1726882767.19223: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882767.19316: in run() - task 0e448fcc-3ce9-926c-8928-00000000006b 28173 1726882767.19327: variable 'ansible_search_path' from source: unknown 28173 1726882767.19331: variable 'ansible_search_path' from source: unknown 28173 1726882767.19358: calling self._execute() 28173 1726882767.19431: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.19435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.19440: variable 'omit' from source: magic vars 28173 1726882767.19708: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.19718: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.19835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.25529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.25577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.25603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.25629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.25648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.25699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.25720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.25739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.25769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.25778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.25841: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.25851: Evaluated conditional (ansible_distribution_major_version | int > 9): False 28173 1726882767.25854: when evaluation is False, skipping this task 28173 1726882767.25857: _execute() done 28173 1726882767.25860: dumping result to json 28173 1726882767.25862: done dumping result, returning 28173 1726882767.25870: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-926c-8928-00000000006b] 28173 1726882767.25873: sending task result for task 
0e448fcc-3ce9-926c-8928-00000000006b 28173 1726882767.25955: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006b 28173 1726882767.25958: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 28173 1726882767.26003: no more pending results, returning what we have 28173 1726882767.26007: results queue empty 28173 1726882767.26007: checking for any_errors_fatal 28173 1726882767.26014: done checking for any_errors_fatal 28173 1726882767.26014: checking for max_fail_percentage 28173 1726882767.26016: done checking for max_fail_percentage 28173 1726882767.26017: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.26018: done checking to see if all hosts have failed 28173 1726882767.26018: getting the remaining hosts for this loop 28173 1726882767.26020: done getting the remaining hosts for this loop 28173 1726882767.26023: getting the next task for host managed_node2 28173 1726882767.26028: done getting next task for host managed_node2 28173 1726882767.26032: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882767.26035: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.26052: getting variables 28173 1726882767.26053: in VariableManager get_vars() 28173 1726882767.26096: Calling all_inventory to load vars for managed_node2 28173 1726882767.26099: Calling groups_inventory to load vars for managed_node2 28173 1726882767.26100: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.26109: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.26111: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.26113: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.30028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.30959: done with get_vars() 28173 1726882767.30977: done getting variables 28173 1726882767.31010: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:27 -0400 (0:00:00.121) 0:00:20.474 ****** 28173 1726882767.31029: entering _queue_task() for managed_node2/dnf 28173 1726882767.31251: worker is 1 (out of 1 available) 28173 1726882767.31265: exiting _queue_task() for managed_node2/dnf 28173 1726882767.31280: done queuing things up, now waiting for results queue to drain 28173 1726882767.31281: waiting for pending results... 
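The teaming abort above is skipped because ansible_distribution_major_version | int > 9 evaluates to False on this EL9 host. A minimal sketch of a task guarded the same way; the message text is assumed:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later   # message text assumed, not taken from this log
      when: ansible_distribution_major_version | int > 9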
28173 1726882767.31457: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882767.31549: in run() - task 0e448fcc-3ce9-926c-8928-00000000006c 28173 1726882767.31567: variable 'ansible_search_path' from source: unknown 28173 1726882767.31571: variable 'ansible_search_path' from source: unknown 28173 1726882767.31602: calling self._execute() 28173 1726882767.31682: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.31687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.31695: variable 'omit' from source: magic vars 28173 1726882767.31961: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.31976: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.32115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.33701: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.33756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.33801: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.33832: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.33857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.33915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.33934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.33955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.33984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.33997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.34079: variable 'ansible_distribution' from source: facts 28173 1726882767.34084: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.34095: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28173 1726882767.34171: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882767.34251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.34273: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.34292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.34318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.34329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.34357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.34377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.34395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.34419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.34431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.34457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.34475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.34493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.34519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.34530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.34628: variable 'network_connections' from source: task vars 28173 1726882767.34636: variable 'interface' from source: set_fact 28173 1726882767.34688: variable 'interface' from source: set_fact 28173 1726882767.34694: variable 'interface' from source: set_fact 28173 1726882767.34784: variable 'interface' from source: set_fact 28173 1726882767.34888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 28173 1726882767.35109: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882767.35168: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882767.35204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882767.35235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882767.35294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882767.35319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882767.35365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.35402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882767.35458: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882767.35856: variable 'network_connections' from source: task vars 28173 1726882767.35860: variable 'interface' from source: set_fact 28173 1726882767.35912: variable 'interface' from source: set_fact 28173 1726882767.35937: variable 'interface' from source: set_fact 28173 1726882767.35981: variable 'interface' from source: set_fact 28173 1726882767.36008: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882767.36011: when evaluation is False, skipping this task 28173 1726882767.36015: _execute() done 28173 1726882767.36018: dumping result to json 28173 1726882767.36020: done dumping result, returning 28173 1726882767.36023: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000006c] 28173 1726882767.36032: sending task result for task 0e448fcc-3ce9-926c-8928-00000000006c 28173 1726882767.36117: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006c 28173 1726882767.36119: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882767.36173: no more pending results, returning what we have 28173 1726882767.36177: results queue empty 28173 1726882767.36178: checking for any_errors_fatal 28173 1726882767.36184: done checking for any_errors_fatal 28173 1726882767.36185: checking for max_fail_percentage 28173 1726882767.36187: done checking for max_fail_percentage 28173 1726882767.36188: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.36189: done checking to see if all hosts have failed 28173 1726882767.36189: getting the remaining hosts for this loop 28173 1726882767.36191: done getting the remaining hosts for this loop 28173 
1726882767.36194: getting the next task for host managed_node2 28173 1726882767.36200: done getting next task for host managed_node2 28173 1726882767.36204: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882767.36206: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882767.36227: getting variables 28173 1726882767.36229: in VariableManager get_vars() 28173 1726882767.36268: Calling all_inventory to load vars for managed_node2 28173 1726882767.36271: Calling groups_inventory to load vars for managed_node2 28173 1726882767.36273: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.36283: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.36286: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.36288: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.37185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.38604: done with get_vars() 28173 1726882767.38625: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882767.38701: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:27 -0400 (0:00:00.076) 0:00:20.551 ****** 28173 1726882767.38732: entering _queue_task() for managed_node2/yum 28173 1726882767.39020: worker is 1 (out of 1 available) 28173 1726882767.39032: exiting _queue_task() for managed_node2/yum 28173 1726882767.39044: done queuing things up, now waiting for results queue to drain 28173 1726882767.39046: waiting for pending results... 
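The YUM variant queued below carries an extra distribution guard: its skip result will show false_condition: ansible_distribution_major_version | int < 8, and the log also notes that ansible.builtin.yum is redirected to ansible.builtin.dnf on this host. A hypothetical sketch of such a version-gated task (arguments are assumptions, not the role's actual task at roles/network/tasks/main.yml:48):

- name: Check if updates for network packages are available through YUM (sketch)
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # variable name taken from this log; contents assumed
    state: latest
  check_mode: true                   # assumed: report-only
  when: ansible_distribution_major_version | int < 8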
28173 1726882767.39334: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882767.39477: in run() - task 0e448fcc-3ce9-926c-8928-00000000006d 28173 1726882767.39505: variable 'ansible_search_path' from source: unknown 28173 1726882767.39514: variable 'ansible_search_path' from source: unknown 28173 1726882767.39558: calling self._execute() 28173 1726882767.39670: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.39682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.39696: variable 'omit' from source: magic vars 28173 1726882767.40106: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.40125: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.40320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.42599: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.42685: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.42728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.42776: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.42809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.42896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.42930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.42968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.43014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.43034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.43144: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.43170: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28173 1726882767.43182: when evaluation is False, skipping this task 28173 1726882767.43190: _execute() done 28173 1726882767.43199: dumping result to json 28173 1726882767.43208: done dumping result, returning 28173 1726882767.43221: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000006d] 28173 
1726882767.43235: sending task result for task 0e448fcc-3ce9-926c-8928-00000000006d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28173 1726882767.43388: no more pending results, returning what we have 28173 1726882767.43392: results queue empty 28173 1726882767.43393: checking for any_errors_fatal 28173 1726882767.43406: done checking for any_errors_fatal 28173 1726882767.43407: checking for max_fail_percentage 28173 1726882767.43409: done checking for max_fail_percentage 28173 1726882767.43410: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.43411: done checking to see if all hosts have failed 28173 1726882767.43412: getting the remaining hosts for this loop 28173 1726882767.43414: done getting the remaining hosts for this loop 28173 1726882767.43417: getting the next task for host managed_node2 28173 1726882767.43424: done getting next task for host managed_node2 28173 1726882767.43428: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882767.43431: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.43453: getting variables 28173 1726882767.43455: in VariableManager get_vars() 28173 1726882767.43505: Calling all_inventory to load vars for managed_node2 28173 1726882767.43508: Calling groups_inventory to load vars for managed_node2 28173 1726882767.43511: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.43522: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.43525: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.43529: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.44484: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006d 28173 1726882767.44488: WORKER PROCESS EXITING 28173 1726882767.45288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.47077: done with get_vars() 28173 1726882767.47098: done getting variables 28173 1726882767.47154: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:27 -0400 (0:00:00.084) 0:00:20.636 ****** 28173 1726882767.47191: entering _queue_task() for managed_node2/fail 28173 1726882767.47447: worker is 1 (out of 1 available) 28173 1726882767.47459: exiting _queue_task() for managed_node2/fail 28173 1726882767.47474: done queuing things up, now waiting for results queue to drain 28173 1726882767.47476: waiting for pending results... 
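The consent task queued below uses the fail action module loaded above together with the same wireless/team guard, so it only aborts the play when wireless or team connections are requested and a restart has not been approved. A hypothetical sketch of that pattern (the message text and any approval variable are assumptions, not the role's actual task at roles/network/tasks/main.yml:60):

- name: Ask user's consent to restart NetworkManager (sketch)
  ansible.builtin.fail:
    msg: >-
      Wireless or team connections require restarting NetworkManager;
      set the role's restart-approval variable to proceed.
  when: __network_wireless_connections_defined or __network_team_connections_defined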
28173 1726882767.47751: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882767.47907: in run() - task 0e448fcc-3ce9-926c-8928-00000000006e 28173 1726882767.47930: variable 'ansible_search_path' from source: unknown 28173 1726882767.47938: variable 'ansible_search_path' from source: unknown 28173 1726882767.47982: calling self._execute() 28173 1726882767.48077: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.48088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.48100: variable 'omit' from source: magic vars 28173 1726882767.48469: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.48487: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.48614: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882767.48814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.51662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.51739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.51800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.51845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.51881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.51958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.51997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.52029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.52085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.52104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.52149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.52186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.52216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.52260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.52289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.52332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.52360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.52397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.52439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.52455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.52640: variable 'network_connections' from source: task vars 28173 1726882767.52654: variable 'interface' from source: set_fact 28173 1726882767.52736: variable 'interface' from source: set_fact 28173 1726882767.52749: variable 'interface' from source: set_fact 28173 1726882767.52819: variable 'interface' from source: set_fact 28173 1726882767.52900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882767.53078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882767.53119: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882767.53157: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882767.53204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882767.53252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882767.53284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882767.53312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.53340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882767.53407: 
variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882767.53654: variable 'network_connections' from source: task vars 28173 1726882767.53668: variable 'interface' from source: set_fact 28173 1726882767.53734: variable 'interface' from source: set_fact 28173 1726882767.53744: variable 'interface' from source: set_fact 28173 1726882767.53814: variable 'interface' from source: set_fact 28173 1726882767.53854: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882767.53861: when evaluation is False, skipping this task 28173 1726882767.53873: _execute() done 28173 1726882767.53881: dumping result to json 28173 1726882767.53888: done dumping result, returning 28173 1726882767.53902: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-00000000006e] 28173 1726882767.53920: sending task result for task 0e448fcc-3ce9-926c-8928-00000000006e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882767.54068: no more pending results, returning what we have 28173 1726882767.54071: results queue empty 28173 1726882767.54072: checking for any_errors_fatal 28173 1726882767.54077: done checking for any_errors_fatal 28173 1726882767.54078: checking for max_fail_percentage 28173 1726882767.54080: done checking for max_fail_percentage 28173 1726882767.54081: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.54082: done checking to see if all hosts have failed 28173 1726882767.54082: getting the remaining hosts for this loop 28173 1726882767.54084: done getting the remaining hosts for this loop 28173 1726882767.54088: getting the next task for host managed_node2 28173 1726882767.54094: done getting next task for host managed_node2 28173 1726882767.54098: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28173 1726882767.54101: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.54119: getting variables 28173 1726882767.54121: in VariableManager get_vars() 28173 1726882767.54160: Calling all_inventory to load vars for managed_node2 28173 1726882767.54165: Calling groups_inventory to load vars for managed_node2 28173 1726882767.54170: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.54184: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.54188: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.54191: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.55187: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006e 28173 1726882767.55190: WORKER PROCESS EXITING 28173 1726882767.56288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.58193: done with get_vars() 28173 1726882767.58215: done getting variables 28173 1726882767.58275: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:27 -0400 (0:00:00.111) 0:00:20.747 ****** 28173 1726882767.58309: entering _queue_task() for managed_node2/package 28173 1726882767.58575: worker is 1 (out of 1 available) 28173 1726882767.58588: exiting _queue_task() for managed_node2/package 28173 1726882767.58600: done queuing things up, now waiting for results queue to drain 28173 1726882767.58601: waiting for pending results... 
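The install task queued below is guarded by a Jinja subset test: its skip result will show false_condition: not network_packages is subset(ansible_facts.packages.keys()), meaning every required package was already present in the package facts gathered earlier in the run. A hypothetical sketch of that guard (not the role's actual task at roles/network/tasks/main.yml:73):

- name: Install packages (sketch)
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Assumes ansible_facts.packages was populated earlier in the play
  # (e.g. by the package_facts module); install only if something is missing.
  when: not network_packages is subset(ansible_facts.packages.keys())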
28173 1726882767.58892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28173 1726882767.59024: in run() - task 0e448fcc-3ce9-926c-8928-00000000006f 28173 1726882767.59048: variable 'ansible_search_path' from source: unknown 28173 1726882767.59056: variable 'ansible_search_path' from source: unknown 28173 1726882767.59098: calling self._execute() 28173 1726882767.59195: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.59205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.59220: variable 'omit' from source: magic vars 28173 1726882767.59591: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.59607: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.59800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882767.60066: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882767.60114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882767.60155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882767.60216: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882767.60336: variable 'network_packages' from source: role '' defaults 28173 1726882767.60449: variable '__network_provider_setup' from source: role '' defaults 28173 1726882767.60472: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882767.60542: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882767.60555: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882767.60628: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882767.60812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.63214: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.63293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.63332: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.63375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.63410: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.63505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.63539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.63573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.63622: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.63639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.63690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.63723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.63750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.63797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.63814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.64054: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882767.64173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.64202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.64229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.64280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.64297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.64394: variable 'ansible_python' from source: facts 28173 1726882767.64423: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882767.64511: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882767.64600: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882767.64733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.64761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28173 1726882767.64798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.64841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.64859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.64913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.64948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.64981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.65027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.65044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.65200: variable 'network_connections' from source: task vars 28173 1726882767.65211: variable 'interface' from source: set_fact 28173 1726882767.65318: variable 'interface' from source: set_fact 28173 1726882767.65331: variable 'interface' from source: set_fact 28173 1726882767.65438: variable 'interface' from source: set_fact 28173 1726882767.65523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882767.65558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882767.65597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.65632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882767.65689: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882767.65982: variable 'network_connections' from source: task vars 28173 1726882767.65995: variable 'interface' from source: set_fact 28173 1726882767.66103: variable 'interface' from source: set_fact 28173 1726882767.66116: variable 'interface' from source: set_fact 28173 1726882767.66225: variable 'interface' from source: set_fact 28173 1726882767.66287: variable 
'__network_packages_default_wireless' from source: role '' defaults 28173 1726882767.66375: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882767.66701: variable 'network_connections' from source: task vars 28173 1726882767.66711: variable 'interface' from source: set_fact 28173 1726882767.66785: variable 'interface' from source: set_fact 28173 1726882767.66796: variable 'interface' from source: set_fact 28173 1726882767.66868: variable 'interface' from source: set_fact 28173 1726882767.66903: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882767.66991: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882767.67325: variable 'network_connections' from source: task vars 28173 1726882767.67335: variable 'interface' from source: set_fact 28173 1726882767.67409: variable 'interface' from source: set_fact 28173 1726882767.67420: variable 'interface' from source: set_fact 28173 1726882767.67490: variable 'interface' from source: set_fact 28173 1726882767.67568: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882767.67635: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882767.67647: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882767.67714: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882767.67948: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882767.68648: variable 'network_connections' from source: task vars 28173 1726882767.68658: variable 'interface' from source: set_fact 28173 1726882767.68731: variable 'interface' from source: set_fact 28173 1726882767.68743: variable 'interface' from source: set_fact 28173 1726882767.68808: variable 'interface' from source: set_fact 28173 1726882767.68832: variable 'ansible_distribution' from source: facts 28173 1726882767.68841: variable '__network_rh_distros' from source: role '' defaults 28173 1726882767.68850: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.68880: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882767.69058: variable 'ansible_distribution' from source: facts 28173 1726882767.69071: variable '__network_rh_distros' from source: role '' defaults 28173 1726882767.69083: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.69100: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882767.69279: variable 'ansible_distribution' from source: facts 28173 1726882767.69288: variable '__network_rh_distros' from source: role '' defaults 28173 1726882767.69297: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.69335: variable 'network_provider' from source: set_fact 28173 1726882767.69354: variable 'ansible_facts' from source: unknown 28173 1726882767.70108: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28173 1726882767.70116: when evaluation is False, skipping this task 28173 1726882767.70124: _execute() done 28173 1726882767.70131: dumping result to json 28173 1726882767.70137: done dumping result, returning 28173 1726882767.70147: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0e448fcc-3ce9-926c-8928-00000000006f] 28173 1726882767.70156: sending task result for task 0e448fcc-3ce9-926c-8928-00000000006f skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28173 1726882767.70310: no more pending results, returning what we have 28173 1726882767.70314: results queue empty 28173 1726882767.70315: checking for any_errors_fatal 28173 1726882767.70322: done checking for any_errors_fatal 28173 1726882767.70323: checking for max_fail_percentage 28173 1726882767.70325: done checking for max_fail_percentage 28173 1726882767.70326: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.70327: done checking to see if all hosts have failed 28173 1726882767.70327: getting the remaining hosts for this loop 28173 1726882767.70329: done getting the remaining hosts for this loop 28173 1726882767.70333: getting the next task for host managed_node2 28173 1726882767.70340: done getting next task for host managed_node2 28173 1726882767.70344: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882767.70348: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.70371: getting variables 28173 1726882767.70373: in VariableManager get_vars() 28173 1726882767.70415: Calling all_inventory to load vars for managed_node2 28173 1726882767.70421: Calling groups_inventory to load vars for managed_node2 28173 1726882767.70424: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.70435: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.70438: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.70440: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.71685: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000006f 28173 1726882767.71689: WORKER PROCESS EXITING 28173 1726882767.72176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.74227: done with get_vars() 28173 1726882767.74250: done getting variables 28173 1726882767.74313: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:27 -0400 (0:00:00.160) 0:00:20.907 ****** 28173 1726882767.74348: entering _queue_task() for managed_node2/package 28173 1726882767.74621: worker is 1 (out of 1 available) 28173 1726882767.74633: exiting _queue_task() for managed_node2/package 28173 1726882767.74646: done queuing things up, now waiting for results queue to drain 28173 1726882767.74647: waiting for pending results... 
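Both nmstate-related install tasks that follow are gated on the role's network_state variable being non-empty; this run drives configuration through network_connections instead, so network_state != {} evaluates to False and both are skipped. A hypothetical sketch of that guard, covering the first of the two tasks (package names are taken from the task titles; this is not the role's actual task at roles/network/tasks/main.yml:85):

- name: Install NetworkManager and nmstate when using network_state variable (sketch)
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}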
28173 1726882767.74930: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882767.75074: in run() - task 0e448fcc-3ce9-926c-8928-000000000070 28173 1726882767.75103: variable 'ansible_search_path' from source: unknown 28173 1726882767.75112: variable 'ansible_search_path' from source: unknown 28173 1726882767.75149: calling self._execute() 28173 1726882767.75251: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.75265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.75283: variable 'omit' from source: magic vars 28173 1726882767.75655: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.75677: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.75804: variable 'network_state' from source: role '' defaults 28173 1726882767.75819: Evaluated conditional (network_state != {}): False 28173 1726882767.75825: when evaluation is False, skipping this task 28173 1726882767.75831: _execute() done 28173 1726882767.75837: dumping result to json 28173 1726882767.75845: done dumping result, returning 28173 1726882767.75857: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-926c-8928-000000000070] 28173 1726882767.75873: sending task result for task 0e448fcc-3ce9-926c-8928-000000000070 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882767.76017: no more pending results, returning what we have 28173 1726882767.76021: results queue empty 28173 1726882767.76022: checking for any_errors_fatal 28173 1726882767.76029: done checking for any_errors_fatal 28173 1726882767.76030: checking for max_fail_percentage 28173 1726882767.76032: done checking for max_fail_percentage 28173 1726882767.76033: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.76034: done checking to see if all hosts have failed 28173 1726882767.76035: getting the remaining hosts for this loop 28173 1726882767.76036: done getting the remaining hosts for this loop 28173 1726882767.76039: getting the next task for host managed_node2 28173 1726882767.76047: done getting next task for host managed_node2 28173 1726882767.76051: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882767.76055: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.76078: getting variables 28173 1726882767.76080: in VariableManager get_vars() 28173 1726882767.76120: Calling all_inventory to load vars for managed_node2 28173 1726882767.76123: Calling groups_inventory to load vars for managed_node2 28173 1726882767.76125: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.76137: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.76140: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.76143: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.77185: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000070 28173 1726882767.77188: WORKER PROCESS EXITING 28173 1726882767.77793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.79548: done with get_vars() 28173 1726882767.79573: done getting variables 28173 1726882767.79631: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:27 -0400 (0:00:00.053) 0:00:20.961 ****** 28173 1726882767.79667: entering _queue_task() for managed_node2/package 28173 1726882767.79923: worker is 1 (out of 1 available) 28173 1726882767.79935: exiting _queue_task() for managed_node2/package 28173 1726882767.79947: done queuing things up, now waiting for results queue to drain 28173 1726882767.79948: waiting for pending results... 
28173 1726882767.80227: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882767.80361: in run() - task 0e448fcc-3ce9-926c-8928-000000000071 28173 1726882767.80390: variable 'ansible_search_path' from source: unknown 28173 1726882767.80398: variable 'ansible_search_path' from source: unknown 28173 1726882767.80435: calling self._execute() 28173 1726882767.80536: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.80547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.80560: variable 'omit' from source: magic vars 28173 1726882767.80927: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.80945: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.81189: variable 'network_state' from source: role '' defaults 28173 1726882767.81277: Evaluated conditional (network_state != {}): False 28173 1726882767.81285: when evaluation is False, skipping this task 28173 1726882767.81292: _execute() done 28173 1726882767.81299: dumping result to json 28173 1726882767.81306: done dumping result, returning 28173 1726882767.81316: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-926c-8928-000000000071] 28173 1726882767.81327: sending task result for task 0e448fcc-3ce9-926c-8928-000000000071 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882767.81473: no more pending results, returning what we have 28173 1726882767.81477: results queue empty 28173 1726882767.81478: checking for any_errors_fatal 28173 1726882767.81487: done checking for any_errors_fatal 28173 1726882767.81488: checking for max_fail_percentage 28173 1726882767.81489: done checking for max_fail_percentage 28173 1726882767.81490: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.81491: done checking to see if all hosts have failed 28173 1726882767.81492: getting the remaining hosts for this loop 28173 1726882767.81494: done getting the remaining hosts for this loop 28173 1726882767.81498: getting the next task for host managed_node2 28173 1726882767.81503: done getting next task for host managed_node2 28173 1726882767.81507: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882767.81511: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.81530: getting variables 28173 1726882767.81532: in VariableManager get_vars() 28173 1726882767.81575: Calling all_inventory to load vars for managed_node2 28173 1726882767.81578: Calling groups_inventory to load vars for managed_node2 28173 1726882767.81580: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.81593: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.81596: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.81599: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.83212: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000071 28173 1726882767.83215: WORKER PROCESS EXITING 28173 1726882767.84034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.87232: done with get_vars() 28173 1726882767.87262: done getting variables 28173 1726882767.87343: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:27 -0400 (0:00:00.077) 0:00:21.038 ****** 28173 1726882767.87382: entering _queue_task() for managed_node2/service 28173 1726882767.87716: worker is 1 (out of 1 available) 28173 1726882767.87727: exiting _queue_task() for managed_node2/service 28173 1726882767.87739: done queuing things up, now waiting for results queue to drain 28173 1726882767.87741: waiting for pending results... 
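The restart task queued below is handled by the service action module and reuses the wireless/team guard seen earlier, so NetworkManager is only restarted when those connection types are actually in play. A hypothetical sketch of that pattern (not the role's actual task at roles/network/tasks/main.yml:109):

- name: Restart NetworkManager due to wireless or team interfaces (sketch)
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined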
28173 1726882767.88145: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882767.88302: in run() - task 0e448fcc-3ce9-926c-8928-000000000072 28173 1726882767.88322: variable 'ansible_search_path' from source: unknown 28173 1726882767.88329: variable 'ansible_search_path' from source: unknown 28173 1726882767.88372: calling self._execute() 28173 1726882767.88473: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.88484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.88496: variable 'omit' from source: magic vars 28173 1726882767.88994: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.89015: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.89141: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882767.89345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882767.91720: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882767.91796: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882767.91839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882767.91882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882767.91912: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882767.91996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.92034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.92070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.92116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.92134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.92189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.92215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.92242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28173 1726882767.92295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.92313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.92353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882767.92389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882767.92420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.92470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882767.92492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882767.92671: variable 'network_connections' from source: task vars 28173 1726882767.92687: variable 'interface' from source: set_fact 28173 1726882767.92751: variable 'interface' from source: set_fact 28173 1726882767.92762: variable 'interface' from source: set_fact 28173 1726882767.92829: variable 'interface' from source: set_fact 28173 1726882767.92919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882767.93103: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882767.93149: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882767.93210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882767.93247: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882767.93296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882767.93349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882767.93395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882767.93426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882767.93513: variable '__network_team_connections_defined' from source: role '' defaults 
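
Flags such as '__network_wireless_connections_defined' and '__network_team_connections_defined' come from the role's defaults and are re-templated against 'network_connections', which is why the log re-reads 'network_connections' and 'interface' immediately before each conditional. Purely as a hypothetical illustration of that technique (not the role's actual defaults file, whose exact expression may differ), such a flag can be derived with Jinja filters:

# Hypothetical defaults entry, for illustration only.
__network_wireless_connections_defined: >-
  {{ network_connections | default([])
     | selectattr('type', 'defined')
     | selectattr('type', 'eq', 'wireless')
     | list | length > 0 }}

When none of the entries in 'network_connections' has type 'wireless', the expression renders False, which is what feeds the skipped conditional a few lines below.
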
28173 1726882767.93769: variable 'network_connections' from source: task vars 28173 1726882767.93783: variable 'interface' from source: set_fact 28173 1726882767.93844: variable 'interface' from source: set_fact 28173 1726882767.93874: variable 'interface' from source: set_fact 28173 1726882767.93939: variable 'interface' from source: set_fact 28173 1726882767.93987: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882767.93997: when evaluation is False, skipping this task 28173 1726882767.94004: _execute() done 28173 1726882767.94011: dumping result to json 28173 1726882767.94019: done dumping result, returning 28173 1726882767.94030: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-000000000072] 28173 1726882767.94056: sending task result for task 0e448fcc-3ce9-926c-8928-000000000072 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882767.94212: no more pending results, returning what we have 28173 1726882767.94216: results queue empty 28173 1726882767.94217: checking for any_errors_fatal 28173 1726882767.94225: done checking for any_errors_fatal 28173 1726882767.94225: checking for max_fail_percentage 28173 1726882767.94227: done checking for max_fail_percentage 28173 1726882767.94228: checking to see if all hosts have failed and the running result is not ok 28173 1726882767.94230: done checking to see if all hosts have failed 28173 1726882767.94230: getting the remaining hosts for this loop 28173 1726882767.94232: done getting the remaining hosts for this loop 28173 1726882767.94236: getting the next task for host managed_node2 28173 1726882767.94243: done getting next task for host managed_node2 28173 1726882767.94247: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882767.94250: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882767.94272: getting variables 28173 1726882767.94275: in VariableManager get_vars() 28173 1726882767.94318: Calling all_inventory to load vars for managed_node2 28173 1726882767.94321: Calling groups_inventory to load vars for managed_node2 28173 1726882767.94324: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882767.94335: Calling all_plugins_play to load vars for managed_node2 28173 1726882767.94338: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882767.94341: Calling groups_plugins_play to load vars for managed_node2 28173 1726882767.95984: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000072 28173 1726882767.95988: WORKER PROCESS EXITING 28173 1726882767.96236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882767.98159: done with get_vars() 28173 1726882767.98185: done getting variables 28173 1726882767.98243: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:27 -0400 (0:00:00.108) 0:00:21.147 ****** 28173 1726882767.98277: entering _queue_task() for managed_node2/service 28173 1726882767.98522: worker is 1 (out of 1 available) 28173 1726882767.98533: exiting _queue_task() for managed_node2/service 28173 1726882767.98546: done queuing things up, now waiting for results queue to drain 28173 1726882767.98547: waiting for pending results... 
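
The restart task reported above uses the same guard pattern through the 'service' action plugin: as its name says, NetworkManager only needs a restart when wireless or team interfaces are involved, and with both flags False the task is skipped. A sketch of the shape of such a task, assuming 'state: restarted' on the NetworkManager unit (the log confirms the task name, the service action plugin and the false condition, but not the exact module arguments):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager        # assumed; the role may take the unit name from a variable
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
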
28173 1726882767.98826: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882767.99017: in run() - task 0e448fcc-3ce9-926c-8928-000000000073 28173 1726882767.99039: variable 'ansible_search_path' from source: unknown 28173 1726882767.99047: variable 'ansible_search_path' from source: unknown 28173 1726882767.99095: calling self._execute() 28173 1726882767.99223: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882767.99235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882767.99248: variable 'omit' from source: magic vars 28173 1726882767.99683: variable 'ansible_distribution_major_version' from source: facts 28173 1726882767.99701: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882767.99869: variable 'network_provider' from source: set_fact 28173 1726882767.99888: variable 'network_state' from source: role '' defaults 28173 1726882767.99910: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28173 1726882767.99921: variable 'omit' from source: magic vars 28173 1726882767.99984: variable 'omit' from source: magic vars 28173 1726882768.00018: variable 'network_service_name' from source: role '' defaults 28173 1726882768.00108: variable 'network_service_name' from source: role '' defaults 28173 1726882768.00221: variable '__network_provider_setup' from source: role '' defaults 28173 1726882768.00231: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882768.00303: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882768.00316: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882768.00398: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882768.00681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882768.03121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882768.03201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882768.03247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882768.03288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882768.03317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882768.03400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.03433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.03470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.03516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 28173 1726882768.03533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.03586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.03612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.03639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.03689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.03707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.03943: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882768.04063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.04096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.04127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.04174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.04192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.04293: variable 'ansible_python' from source: facts 28173 1726882768.04325: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882768.04412: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882768.04501: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882768.04619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.04653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.04692: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.04737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.04761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.04818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.04893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.04921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.05192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.05211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.05398: variable 'network_connections' from source: task vars 28173 1726882768.05425: variable 'interface' from source: set_fact 28173 1726882768.05511: variable 'interface' from source: set_fact 28173 1726882768.05542: variable 'interface' from source: set_fact 28173 1726882768.05626: variable 'interface' from source: set_fact 28173 1726882768.05777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882768.05971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882768.06026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882768.06079: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882768.06128: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882768.06195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882768.06231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882768.06274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.06336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 28173 1726882768.06389: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882768.06854: variable 'network_connections' from source: task vars 28173 1726882768.06883: variable 'interface' from source: set_fact 28173 1726882768.06989: variable 'interface' from source: set_fact 28173 1726882768.07033: variable 'interface' from source: set_fact 28173 1726882768.07145: variable 'interface' from source: set_fact 28173 1726882768.07235: variable '__network_packages_default_wireless' from source: role '' defaults 28173 1726882768.07352: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882768.07673: variable 'network_connections' from source: task vars 28173 1726882768.07684: variable 'interface' from source: set_fact 28173 1726882768.07757: variable 'interface' from source: set_fact 28173 1726882768.07774: variable 'interface' from source: set_fact 28173 1726882768.07844: variable 'interface' from source: set_fact 28173 1726882768.07887: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882768.07970: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882768.08414: variable 'network_connections' from source: task vars 28173 1726882768.08419: variable 'interface' from source: set_fact 28173 1726882768.08475: variable 'interface' from source: set_fact 28173 1726882768.08478: variable 'interface' from source: set_fact 28173 1726882768.08550: variable 'interface' from source: set_fact 28173 1726882768.08601: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882768.08644: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882768.08650: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882768.08700: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882768.09074: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882768.09472: variable 'network_connections' from source: task vars 28173 1726882768.09476: variable 'interface' from source: set_fact 28173 1726882768.09479: variable 'interface' from source: set_fact 28173 1726882768.09481: variable 'interface' from source: set_fact 28173 1726882768.09513: variable 'interface' from source: set_fact 28173 1726882768.09525: variable 'ansible_distribution' from source: facts 28173 1726882768.09528: variable '__network_rh_distros' from source: role '' defaults 28173 1726882768.09533: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.09552: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882768.09703: variable 'ansible_distribution' from source: facts 28173 1726882768.09707: variable '__network_rh_distros' from source: role '' defaults 28173 1726882768.09709: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.09722: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882768.09876: variable 'ansible_distribution' from source: facts 28173 1726882768.09880: variable '__network_rh_distros' from source: role '' defaults 28173 1726882768.09885: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.09920: variable 'network_provider' from source: set_fact 28173 1726882768.09941: 
variable 'omit' from source: magic vars 28173 1726882768.09969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882768.09994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882768.10009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882768.10024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882768.10034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882768.10060: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882768.10068: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.10071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.10169: Set connection var ansible_pipelining to False 28173 1726882768.10172: Set connection var ansible_shell_type to sh 28173 1726882768.10186: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882768.10193: Set connection var ansible_timeout to 10 28173 1726882768.10199: Set connection var ansible_shell_executable to /bin/sh 28173 1726882768.10204: Set connection var ansible_connection to ssh 28173 1726882768.10228: variable 'ansible_shell_executable' from source: unknown 28173 1726882768.10231: variable 'ansible_connection' from source: unknown 28173 1726882768.10234: variable 'ansible_module_compression' from source: unknown 28173 1726882768.10240: variable 'ansible_shell_type' from source: unknown 28173 1726882768.10278: variable 'ansible_shell_executable' from source: unknown 28173 1726882768.10280: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.10286: variable 'ansible_pipelining' from source: unknown 28173 1726882768.10288: variable 'ansible_timeout' from source: unknown 28173 1726882768.10289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.10535: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882768.10543: variable 'omit' from source: magic vars 28173 1726882768.10549: starting attempt loop 28173 1726882768.10551: running the handler 28173 1726882768.10626: variable 'ansible_facts' from source: unknown 28173 1726882768.11422: _low_level_execute_command(): starting 28173 1726882768.11428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882768.12072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882768.12083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.12111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.12134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.12172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.12180: stderr chunk (state=3): >>>debug2: match not found <<< 
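
The 'Set connection var' entries above map onto standard Ansible connection variables, and the first low-level command, /bin/sh -c 'echo ~ && sleep 0', is Ansible expanding the remote user's home directory so it can create its working directory under ~/.ansible/tmp, which is what the mkdir command a little further down does. A host_vars sketch that would reproduce the same connection settings, with values copied from the log and the file placement assumed (the same settings can also come from ansible.cfg or play-level vars):

# host_vars/managed_node2.yml  (placement assumed; values as recorded in the log)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
# Module compression (ZIP_DEFLATED above) is normally configured globally via the
# 'module_compression' option in ansible.cfg rather than per host.

The 'auto-mux: Trying existing master' and 'mux_client_request_session' lines in the SSH debug output show that each of these commands reuses one multiplexed ControlMaster connection rather than opening a fresh SSH session per step.
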
28173 1726882768.12190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.12203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882768.12211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882768.12218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882768.12225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.12245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.12269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.12279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.12302: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882768.12312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.12406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.12421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882768.12424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.12661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.14330: stdout chunk (state=3): >>>/root <<< 28173 1726882768.14497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.14529: stderr chunk (state=3): >>><<< 28173 1726882768.14534: stdout chunk (state=3): >>><<< 28173 1726882768.14691: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.14694: _low_level_execute_command(): starting 28173 1726882768.14698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271 `" && echo ansible-tmp-1726882768.145587-29106-41119965958271="` echo /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271 `" ) && sleep 0' 28173 1726882768.15203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 28173 1726882768.15216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.15226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.15240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.15281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.15287: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882768.15297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.15310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882768.15322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882768.15329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882768.15337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.15346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.15358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.15372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.15375: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882768.15383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.15455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.15474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882768.15485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.15614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.17540: stdout chunk (state=3): >>>ansible-tmp-1726882768.145587-29106-41119965958271=/root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271 <<< 28173 1726882768.17711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.17714: stdout chunk (state=3): >>><<< 28173 1726882768.17722: stderr chunk (state=3): >>><<< 28173 1726882768.17735: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882768.145587-29106-41119965958271=/root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.17770: variable 'ansible_module_compression' from source: unknown 28173 1726882768.17819: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28173 1726882768.17879: variable 'ansible_facts' from source: unknown 28173 1726882768.18055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/AnsiballZ_systemd.py 28173 1726882768.18203: Sending initial data 28173 1726882768.18206: Sent initial data (154 bytes) 28173 1726882768.19577: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882768.19580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.19583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.19585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.19587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.19590: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882768.19592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.19594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882768.19596: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882768.19598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882768.19600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.19602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.19604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.19606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.19608: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882768.19610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.19612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.19614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882768.19616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.19807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.21575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882768.21679: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882768.21778: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpk9fzlqq5 /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/AnsiballZ_systemd.py <<< 28173 1726882768.21879: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882768.24076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.24143: stderr chunk (state=3): >>><<< 28173 1726882768.24146: stdout chunk (state=3): >>><<< 28173 1726882768.24163: done transferring module to remote 28173 1726882768.24182: _low_level_execute_command(): starting 28173 1726882768.24185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/ /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/AnsiballZ_systemd.py && sleep 0' 28173 1726882768.25077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.25082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.25098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.25136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.25141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882768.25172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882768.25185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.25203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.25213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.25218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.25288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882768.25292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.25416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.27216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.27256: stderr chunk (state=3): >>><<< 28173 1726882768.27260: stdout chunk (state=3): >>><<< 28173 1726882768.27304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.27307: _low_level_execute_command(): starting 28173 1726882768.27312: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/AnsiballZ_systemd.py && sleep 0' 28173 1726882768.27910: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.27917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.27958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.27973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.27995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.28001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.28077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.28082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882768.28098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.28233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.53227: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9211904", "MemoryAvailable": "infinity", "CPUUsageNSec": "1958294000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 28173 1726882768.53255: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": 
"infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": 
"no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28173 1726882768.54809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882768.54867: stderr chunk (state=3): >>><<< 28173 1726882768.54871: stdout chunk (state=3): >>><<< 28173 1726882768.54887: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9211904", "MemoryAvailable": "infinity", "CPUUsageNSec": 
"1958294000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882768.54997: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882768.55013: _low_level_execute_command(): starting 28173 1726882768.55016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882768.145587-29106-41119965958271/ > /dev/null 2>&1 && sleep 0' 28173 1726882768.55474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.55481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.55490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.55520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.55533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882768.55543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.55591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.55603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.55707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.57512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.57555: stderr chunk (state=3): >>><<< 
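The module arguments echoed in the _execute_module line above (name=NetworkManager, state=started, enabled=True, scope=system) amount to an ordinary systemd service task. A minimal sketch of an equivalent standalone task, reconstructed only from those logged arguments and not from the role's actual source, would be:

# Sketch reconstructed from the module args in the log above; the task
# name and placement are assumptions, the argument values mirror the log.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system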
28173 1726882768.57558: stdout chunk (state=3): >>><<< 28173 1726882768.57574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.57581: handler run complete 28173 1726882768.57616: attempt loop complete, returning result 28173 1726882768.57618: _execute() done 28173 1726882768.57621: dumping result to json 28173 1726882768.57631: done dumping result, returning 28173 1726882768.57640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-926c-8928-000000000073] 28173 1726882768.57645: sending task result for task 0e448fcc-3ce9-926c-8928-000000000073 28173 1726882768.57866: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000073 28173 1726882768.57869: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882768.57921: no more pending results, returning what we have 28173 1726882768.57924: results queue empty 28173 1726882768.57925: checking for any_errors_fatal 28173 1726882768.57937: done checking for any_errors_fatal 28173 1726882768.57938: checking for max_fail_percentage 28173 1726882768.57940: done checking for max_fail_percentage 28173 1726882768.57941: checking to see if all hosts have failed and the running result is not ok 28173 1726882768.57942: done checking to see if all hosts have failed 28173 1726882768.57943: getting the remaining hosts for this loop 28173 1726882768.57944: done getting the remaining hosts for this loop 28173 1726882768.57948: getting the next task for host managed_node2 28173 1726882768.57954: done getting next task for host managed_node2 28173 1726882768.57958: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882768.57961: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 28173 1726882768.57973: getting variables 28173 1726882768.57975: in VariableManager get_vars() 28173 1726882768.58009: Calling all_inventory to load vars for managed_node2 28173 1726882768.58011: Calling groups_inventory to load vars for managed_node2 28173 1726882768.58013: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882768.58023: Calling all_plugins_play to load vars for managed_node2 28173 1726882768.58025: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882768.58027: Calling groups_plugins_play to load vars for managed_node2 28173 1726882768.58840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882768.59780: done with get_vars() 28173 1726882768.59796: done getting variables 28173 1726882768.59841: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:28 -0400 (0:00:00.615) 0:00:21.763 ****** 28173 1726882768.59866: entering _queue_task() for managed_node2/service 28173 1726882768.60069: worker is 1 (out of 1 available) 28173 1726882768.60082: exiting _queue_task() for managed_node2/service 28173 1726882768.60095: done queuing things up, now waiting for results queue to drain 28173 1726882768.60096: waiting for pending results... 
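The "censored" body in the NetworkManager task result above is Ansible's standard no_log redaction: when a task runs with no_log: true, the displayed result is replaced by that fixed placeholder message. A purely hypothetical illustration of the keyword (not a task from this role):

# Hypothetical task; any result produced under no_log: true is rendered in
# callback output as the "output has been hidden" placeholder seen above.
- name: Example task whose output will be censored
  ansible.builtin.command: echo "not shown in the log"
  no_log: true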
28173 1726882768.60282: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882768.60379: in run() - task 0e448fcc-3ce9-926c-8928-000000000074 28173 1726882768.60391: variable 'ansible_search_path' from source: unknown 28173 1726882768.60394: variable 'ansible_search_path' from source: unknown 28173 1726882768.60422: calling self._execute() 28173 1726882768.60496: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.60501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.60509: variable 'omit' from source: magic vars 28173 1726882768.60781: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.60791: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882768.60871: variable 'network_provider' from source: set_fact 28173 1726882768.60877: Evaluated conditional (network_provider == "nm"): True 28173 1726882768.60941: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882768.61003: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882768.61118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882768.62790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882768.62831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882768.62856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882768.62886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882768.62905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882768.62959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.62983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.63001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.63027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.63038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.63071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.63089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28173 1726882768.63109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.63134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.63145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.63176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882768.63191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.63213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.63237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.63249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.63343: variable 'network_connections' from source: task vars 28173 1726882768.63352: variable 'interface' from source: set_fact 28173 1726882768.63403: variable 'interface' from source: set_fact 28173 1726882768.63416: variable 'interface' from source: set_fact 28173 1726882768.63454: variable 'interface' from source: set_fact 28173 1726882768.63525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882768.63626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882768.63653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882768.63679: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882768.63702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882768.63731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882768.63749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882768.63768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.63789: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882768.63824: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882768.63979: variable 'network_connections' from source: task vars 28173 1726882768.63983: variable 'interface' from source: set_fact 28173 1726882768.64026: variable 'interface' from source: set_fact 28173 1726882768.64031: variable 'interface' from source: set_fact 28173 1726882768.64079: variable 'interface' from source: set_fact 28173 1726882768.64109: Evaluated conditional (__network_wpa_supplicant_required): False 28173 1726882768.64112: when evaluation is False, skipping this task 28173 1726882768.64116: _execute() done 28173 1726882768.64127: dumping result to json 28173 1726882768.64130: done dumping result, returning 28173 1726882768.64132: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-926c-8928-000000000074] 28173 1726882768.64134: sending task result for task 0e448fcc-3ce9-926c-8928-000000000074 28173 1726882768.64219: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000074 28173 1726882768.64222: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28173 1726882768.64290: no more pending results, returning what we have 28173 1726882768.64297: results queue empty 28173 1726882768.64298: checking for any_errors_fatal 28173 1726882768.64312: done checking for any_errors_fatal 28173 1726882768.64313: checking for max_fail_percentage 28173 1726882768.64314: done checking for max_fail_percentage 28173 1726882768.64315: checking to see if all hosts have failed and the running result is not ok 28173 1726882768.64316: done checking to see if all hosts have failed 28173 1726882768.64316: getting the remaining hosts for this loop 28173 1726882768.64318: done getting the remaining hosts for this loop 28173 1726882768.64321: getting the next task for host managed_node2 28173 1726882768.64325: done getting next task for host managed_node2 28173 1726882768.64328: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882768.64331: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882768.64345: getting variables 28173 1726882768.64347: in VariableManager get_vars() 28173 1726882768.64382: Calling all_inventory to load vars for managed_node2 28173 1726882768.64385: Calling groups_inventory to load vars for managed_node2 28173 1726882768.64387: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882768.64399: Calling all_plugins_play to load vars for managed_node2 28173 1726882768.64402: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882768.64405: Calling groups_plugins_play to load vars for managed_node2 28173 1726882768.65242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882768.66173: done with get_vars() 28173 1726882768.66187: done getting variables 28173 1726882768.66226: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:28 -0400 (0:00:00.063) 0:00:21.826 ****** 28173 1726882768.66248: entering _queue_task() for managed_node2/service 28173 1726882768.66433: worker is 1 (out of 1 available) 28173 1726882768.66446: exiting _queue_task() for managed_node2/service 28173 1726882768.66457: done queuing things up, now waiting for results queue to drain 28173 1726882768.66459: waiting for pending results... 28173 1726882768.66634: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882768.66721: in run() - task 0e448fcc-3ce9-926c-8928-000000000075 28173 1726882768.66733: variable 'ansible_search_path' from source: unknown 28173 1726882768.66736: variable 'ansible_search_path' from source: unknown 28173 1726882768.66762: calling self._execute() 28173 1726882768.66836: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.66840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.66848: variable 'omit' from source: magic vars 28173 1726882768.67104: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.67122: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882768.67199: variable 'network_provider' from source: set_fact 28173 1726882768.67203: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882768.67206: when evaluation is False, skipping this task 28173 1726882768.67208: _execute() done 28173 1726882768.67211: dumping result to json 28173 1726882768.67219: done dumping result, returning 28173 1726882768.67229: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-926c-8928-000000000075] 28173 1726882768.67235: sending task result for task 0e448fcc-3ce9-926c-8928-000000000075 28173 1726882768.67315: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000075 28173 1726882768.67321: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 
1726882768.67371: no more pending results, returning what we have 28173 1726882768.67375: results queue empty 28173 1726882768.67375: checking for any_errors_fatal 28173 1726882768.67381: done checking for any_errors_fatal 28173 1726882768.67382: checking for max_fail_percentage 28173 1726882768.67383: done checking for max_fail_percentage 28173 1726882768.67384: checking to see if all hosts have failed and the running result is not ok 28173 1726882768.67385: done checking to see if all hosts have failed 28173 1726882768.67386: getting the remaining hosts for this loop 28173 1726882768.67387: done getting the remaining hosts for this loop 28173 1726882768.67390: getting the next task for host managed_node2 28173 1726882768.67394: done getting next task for host managed_node2 28173 1726882768.67398: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882768.67401: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882768.67416: getting variables 28173 1726882768.67417: in VariableManager get_vars() 28173 1726882768.67453: Calling all_inventory to load vars for managed_node2 28173 1726882768.67455: Calling groups_inventory to load vars for managed_node2 28173 1726882768.67457: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882768.67465: Calling all_plugins_play to load vars for managed_node2 28173 1726882768.67467: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882768.67470: Calling groups_plugins_play to load vars for managed_node2 28173 1726882768.68223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882768.69238: done with get_vars() 28173 1726882768.69252: done getting variables 28173 1726882768.69296: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:28 -0400 (0:00:00.030) 0:00:21.857 ****** 28173 1726882768.69318: entering _queue_task() for managed_node2/copy 28173 1726882768.69491: worker is 1 (out of 1 available) 28173 1726882768.69503: exiting _queue_task() for managed_node2/copy 28173 1726882768.69515: done queuing things up, now waiting for results queue to drain 28173 1726882768.69516: waiting for pending results... 
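The two skips above ("Enable and start wpa_supplicant" and "Enable network service") both come from when: conditionals evaluated against the selected provider, which this run resolved to "nm". A minimal sketch of that gating pattern, using only the variable names printed in the log (the service names are inferred from the task names and the module arguments are illustrative, not the role's actual tasks):

# Gating pattern inferred from the skip messages in the log; the
# conditionals are taken verbatim, everything else is an assumption.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required

- name: Enable network service
  ansible.builtin.systemd:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"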
28173 1726882768.69695: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882768.69783: in run() - task 0e448fcc-3ce9-926c-8928-000000000076 28173 1726882768.69793: variable 'ansible_search_path' from source: unknown 28173 1726882768.69797: variable 'ansible_search_path' from source: unknown 28173 1726882768.69825: calling self._execute() 28173 1726882768.69902: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.69906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.69914: variable 'omit' from source: magic vars 28173 1726882768.70176: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.70186: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882768.70260: variable 'network_provider' from source: set_fact 28173 1726882768.70268: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882768.70274: when evaluation is False, skipping this task 28173 1726882768.70278: _execute() done 28173 1726882768.70280: dumping result to json 28173 1726882768.70286: done dumping result, returning 28173 1726882768.70294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-926c-8928-000000000076] 28173 1726882768.70300: sending task result for task 0e448fcc-3ce9-926c-8928-000000000076 28173 1726882768.70384: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000076 28173 1726882768.70387: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28173 1726882768.70431: no more pending results, returning what we have 28173 1726882768.70434: results queue empty 28173 1726882768.70435: checking for any_errors_fatal 28173 1726882768.70438: done checking for any_errors_fatal 28173 1726882768.70439: checking for max_fail_percentage 28173 1726882768.70440: done checking for max_fail_percentage 28173 1726882768.70441: checking to see if all hosts have failed and the running result is not ok 28173 1726882768.70442: done checking to see if all hosts have failed 28173 1726882768.70442: getting the remaining hosts for this loop 28173 1726882768.70444: done getting the remaining hosts for this loop 28173 1726882768.70446: getting the next task for host managed_node2 28173 1726882768.70451: done getting next task for host managed_node2 28173 1726882768.70454: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882768.70457: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882768.70474: getting variables 28173 1726882768.70476: in VariableManager get_vars() 28173 1726882768.70508: Calling all_inventory to load vars for managed_node2 28173 1726882768.70511: Calling groups_inventory to load vars for managed_node2 28173 1726882768.70513: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882768.70520: Calling all_plugins_play to load vars for managed_node2 28173 1726882768.70522: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882768.70523: Calling groups_plugins_play to load vars for managed_node2 28173 1726882768.71265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882768.72195: done with get_vars() 28173 1726882768.72212: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:28 -0400 (0:00:00.029) 0:00:21.887 ****** 28173 1726882768.72267: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882768.72429: worker is 1 (out of 1 available) 28173 1726882768.72442: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882768.72455: done queuing things up, now waiting for results queue to drain 28173 1726882768.72456: waiting for pending results... 28173 1726882768.72624: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882768.72712: in run() - task 0e448fcc-3ce9-926c-8928-000000000077 28173 1726882768.72723: variable 'ansible_search_path' from source: unknown 28173 1726882768.72727: variable 'ansible_search_path' from source: unknown 28173 1726882768.72756: calling self._execute() 28173 1726882768.72830: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.72835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.72841: variable 'omit' from source: magic vars 28173 1726882768.73099: variable 'ansible_distribution_major_version' from source: facts 28173 1726882768.73110: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882768.73116: variable 'omit' from source: magic vars 28173 1726882768.73150: variable 'omit' from source: magic vars 28173 1726882768.73257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882768.74968: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882768.75009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882768.75035: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882768.75063: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882768.75089: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882768.75144: variable 'network_provider' from source: set_fact 28173 1726882768.75239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28173 1726882768.75268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882768.75286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882768.75313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882768.75323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882768.75380: variable 'omit' from source: magic vars 28173 1726882768.75457: variable 'omit' from source: magic vars 28173 1726882768.75531: variable 'network_connections' from source: task vars 28173 1726882768.75539: variable 'interface' from source: set_fact 28173 1726882768.75590: variable 'interface' from source: set_fact 28173 1726882768.75596: variable 'interface' from source: set_fact 28173 1726882768.75638: variable 'interface' from source: set_fact 28173 1726882768.75790: variable 'omit' from source: magic vars 28173 1726882768.75793: variable '__lsr_ansible_managed' from source: task vars 28173 1726882768.75837: variable '__lsr_ansible_managed' from source: task vars 28173 1726882768.76018: Loaded config def from plugin (lookup/template) 28173 1726882768.76021: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28173 1726882768.76039: File lookup term: get_ansible_managed.j2 28173 1726882768.76043: variable 'ansible_search_path' from source: unknown 28173 1726882768.76047: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28173 1726882768.76058: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28173 1726882768.76073: variable 'ansible_search_path' from source: unknown 28173 1726882768.79457: variable 'ansible_managed' from source: unknown 28173 1726882768.79536: variable 'omit' from source: magic vars 28173 1726882768.79556: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882768.79579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882768.79593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882768.79607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882768.79615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882768.79636: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882768.79639: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.79642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.79705: Set connection var ansible_pipelining to False 28173 1726882768.79708: Set connection var ansible_shell_type to sh 28173 1726882768.79714: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882768.79722: Set connection var ansible_timeout to 10 28173 1726882768.79727: Set connection var ansible_shell_executable to /bin/sh 28173 1726882768.79732: Set connection var ansible_connection to ssh 28173 1726882768.79747: variable 'ansible_shell_executable' from source: unknown 28173 1726882768.79749: variable 'ansible_connection' from source: unknown 28173 1726882768.79752: variable 'ansible_module_compression' from source: unknown 28173 1726882768.79754: variable 'ansible_shell_type' from source: unknown 28173 1726882768.79757: variable 'ansible_shell_executable' from source: unknown 28173 1726882768.79759: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882768.79762: variable 'ansible_pipelining' from source: unknown 28173 1726882768.79768: variable 'ansible_timeout' from source: unknown 28173 1726882768.79771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882768.79856: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882768.79871: variable 'omit' from source: magic vars 28173 1726882768.79874: starting attempt loop 28173 1726882768.79877: running the handler 28173 1726882768.79887: _low_level_execute_command(): starting 28173 1726882768.79897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882768.80398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.80413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.80430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882768.80449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.80497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.80509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.80627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.82297: stdout chunk (state=3): >>>/root <<< 28173 1726882768.82401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.82449: stderr chunk (state=3): >>><<< 28173 1726882768.82452: stdout chunk (state=3): >>><<< 28173 1726882768.82474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.82483: _low_level_execute_command(): starting 28173 1726882768.82490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592 `" && echo ansible-tmp-1726882768.824737-29133-100792244482592="` echo /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592 `" ) && sleep 0' 28173 1726882768.82922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.82936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.82951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.82962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882768.82974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.83021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.83040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.83136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.85006: stdout chunk (state=3): >>>ansible-tmp-1726882768.824737-29133-100792244482592=/root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592 <<< 28173 1726882768.85123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.85173: stderr chunk (state=3): >>><<< 28173 1726882768.85176: stdout chunk (state=3): >>><<< 28173 1726882768.85190: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882768.824737-29133-100792244482592=/root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.85221: variable 'ansible_module_compression' from source: unknown 28173 1726882768.85255: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28173 1726882768.85284: variable 'ansible_facts' from source: unknown 28173 1726882768.85345: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/AnsiballZ_network_connections.py 28173 1726882768.85445: Sending initial data 28173 1726882768.85454: Sent initial data (167 bytes) 28173 1726882768.86094: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.86098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.86133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.86141: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.86144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.86192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.86196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.86300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.88029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882768.88124: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882768.88226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp3fogmi7g /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/AnsiballZ_network_connections.py <<< 28173 1726882768.88322: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882768.90188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.90300: stderr chunk (state=3): >>><<< 28173 1726882768.90303: stdout chunk (state=3): >>><<< 28173 1726882768.90306: done transferring module to remote 28173 1726882768.90308: _low_level_execute_command(): starting 28173 1726882768.90310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/ /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/AnsiballZ_network_connections.py && sleep 0' 28173 1726882768.90772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882768.90775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.90804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.90808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 
1726882768.90810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.90813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882768.90825: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882768.90827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.90882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.90885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.90992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882768.92803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882768.92844: stderr chunk (state=3): >>><<< 28173 1726882768.92848: stdout chunk (state=3): >>><<< 28173 1726882768.92859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882768.92862: _low_level_execute_command(): starting 28173 1726882768.92907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/AnsiballZ_network_connections.py && sleep 0' 28173 1726882768.93262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.93268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882768.93301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.93304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882768.93307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882768.93355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882768.93359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882768.93470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.18342: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28173 1726882769.19921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882769.19980: stderr chunk (state=3): >>><<< 28173 1726882769.19984: stdout chunk (state=3): >>><<< 28173 1726882769.20000: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
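
The module result above shows fedora.linux_system_roles.network_connections reapplying the ethtest0 profile with a static address and three routes placed in the "custom" routing table. A minimal sketch of role input that would drive an equivalent invocation follows; the connection data is taken from the module_args in this run, while the play framing and the assumption that the "custom" table name is already known to the host (for example via /etc/iproute2/rt_tables) are illustrative, not part of this log.

# Sketch only: play framing assumed, connection data copied from the run above.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            state: up
            autoconnect: true
            ip:
              dhcp4: false
              address:
                - 198.51.100.3/26
              route:
                - network: 198.51.100.128
                  prefix: 26
                  gateway: 198.51.100.1
                  metric: 2
                  table: custom
                - network: 198.51.100.64
                  prefix: 26
                  gateway: 198.51.100.6
                  metric: 4
                  table: custom
                - network: 192.0.2.64
                  prefix: 26
                  gateway: 198.51.100.8
                  metric: 50
                  src: 198.51.100.3
                  table: custom
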
28173 1726882769.20052: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 'custom'}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 'custom'}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 'custom', 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882769.20058: _low_level_execute_command(): starting 28173 1726882769.20065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882768.824737-29133-100792244482592/ > /dev/null 2>&1 && sleep 0' 28173 1726882769.20529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.20542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.20562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882769.20584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.20627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.20638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.20744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.22529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.22576: stderr chunk (state=3): >>><<< 28173 1726882769.22579: stdout chunk (state=3): >>><<< 28173 1726882769.22591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.22597: handler run complete 28173 1726882769.22634: attempt loop complete, returning result 28173 1726882769.22637: _execute() done 28173 1726882769.22640: dumping result to json 28173 1726882769.22645: done dumping result, returning 28173 1726882769.22654: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-926c-8928-000000000077] 28173 1726882769.22659: sending task result for task 0e448fcc-3ce9-926c-8928-000000000077 28173 1726882769.22776: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000077 28173 1726882769.22779: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified) [005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied 28173 1726882769.22928: no more pending results, returning what we have 28173 1726882769.22931: results queue empty 28173 1726882769.22932: checking for any_errors_fatal 28173 1726882769.22938: done checking for any_errors_fatal 28173 1726882769.22938: checking for max_fail_percentage 28173 1726882769.22940: done checking for max_fail_percentage 28173 1726882769.22940: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.22941: done checking to see if all hosts have failed 28173 1726882769.22942: getting the remaining hosts for this loop 28173 1726882769.22943: done getting the remaining hosts for this loop 28173 1726882769.22946: getting the next task for host managed_node2 28173 
1726882769.22951: done getting next task for host managed_node2 28173 1726882769.22954: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882769.22957: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882769.22969: getting variables 28173 1726882769.22971: in VariableManager get_vars() 28173 1726882769.23013: Calling all_inventory to load vars for managed_node2 28173 1726882769.23016: Calling groups_inventory to load vars for managed_node2 28173 1726882769.23018: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.23027: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.23029: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.23032: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.23936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.24862: done with get_vars() 28173 1726882769.24880: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:29 -0400 (0:00:00.526) 0:00:22.413 ****** 28173 1726882769.24939: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882769.25150: worker is 1 (out of 1 available) 28173 1726882769.25165: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882769.25176: done queuing things up, now waiting for results queue to drain 28173 1726882769.25177: waiting for pending results... 
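
The "Configure networking state" task queued here is skipped a few records below because this run leaves the role's network_state variable at its empty default and the task is gated on network_state != {}. The sketch below shows how a caller would opt into that state-based interface instead; the nmstate-style schema is illustrative and assumed, since this run never populates network_state.

# Illustrative only: schema assumed, not taken from this run.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: ethtest0
              type: ethernet
              state: up
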
28173 1726882769.25356: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882769.25441: in run() - task 0e448fcc-3ce9-926c-8928-000000000078 28173 1726882769.25454: variable 'ansible_search_path' from source: unknown 28173 1726882769.25457: variable 'ansible_search_path' from source: unknown 28173 1726882769.25494: calling self._execute() 28173 1726882769.25569: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.25575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.25584: variable 'omit' from source: magic vars 28173 1726882769.25858: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.25872: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.25957: variable 'network_state' from source: role '' defaults 28173 1726882769.25966: Evaluated conditional (network_state != {}): False 28173 1726882769.25970: when evaluation is False, skipping this task 28173 1726882769.25973: _execute() done 28173 1726882769.25978: dumping result to json 28173 1726882769.25980: done dumping result, returning 28173 1726882769.25986: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-926c-8928-000000000078] 28173 1726882769.25992: sending task result for task 0e448fcc-3ce9-926c-8928-000000000078 28173 1726882769.26076: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000078 28173 1726882769.26079: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882769.26129: no more pending results, returning what we have 28173 1726882769.26133: results queue empty 28173 1726882769.26133: checking for any_errors_fatal 28173 1726882769.26144: done checking for any_errors_fatal 28173 1726882769.26145: checking for max_fail_percentage 28173 1726882769.26146: done checking for max_fail_percentage 28173 1726882769.26147: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.26148: done checking to see if all hosts have failed 28173 1726882769.26149: getting the remaining hosts for this loop 28173 1726882769.26150: done getting the remaining hosts for this loop 28173 1726882769.26153: getting the next task for host managed_node2 28173 1726882769.26158: done getting next task for host managed_node2 28173 1726882769.26161: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882769.26166: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882769.26181: getting variables 28173 1726882769.26182: in VariableManager get_vars() 28173 1726882769.26222: Calling all_inventory to load vars for managed_node2 28173 1726882769.26224: Calling groups_inventory to load vars for managed_node2 28173 1726882769.26226: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.26235: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.26237: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.26239: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.27000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.28011: done with get_vars() 28173 1726882769.28026: done getting variables 28173 1726882769.28070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:29 -0400 (0:00:00.031) 0:00:22.445 ****** 28173 1726882769.28093: entering _queue_task() for managed_node2/debug 28173 1726882769.28277: worker is 1 (out of 1 available) 28173 1726882769.28291: exiting _queue_task() for managed_node2/debug 28173 1726882769.28303: done queuing things up, now waiting for results queue to drain 28173 1726882769.28304: waiting for pending results... 28173 1726882769.28479: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882769.28562: in run() - task 0e448fcc-3ce9-926c-8928-000000000079 28173 1726882769.28581: variable 'ansible_search_path' from source: unknown 28173 1726882769.28584: variable 'ansible_search_path' from source: unknown 28173 1726882769.28610: calling self._execute() 28173 1726882769.28680: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.28686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.28692: variable 'omit' from source: magic vars 28173 1726882769.28948: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.28959: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.28962: variable 'omit' from source: magic vars 28173 1726882769.29009: variable 'omit' from source: magic vars 28173 1726882769.29039: variable 'omit' from source: magic vars 28173 1726882769.29070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882769.29100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882769.29114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882769.29130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.29144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.29169: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882769.29173: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.29176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.29245: Set connection var ansible_pipelining to False 28173 1726882769.29248: Set connection var ansible_shell_type to sh 28173 1726882769.29251: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882769.29258: Set connection var ansible_timeout to 10 28173 1726882769.29263: Set connection var ansible_shell_executable to /bin/sh 28173 1726882769.29269: Set connection var ansible_connection to ssh 28173 1726882769.29286: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.29289: variable 'ansible_connection' from source: unknown 28173 1726882769.29293: variable 'ansible_module_compression' from source: unknown 28173 1726882769.29295: variable 'ansible_shell_type' from source: unknown 28173 1726882769.29297: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.29299: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.29302: variable 'ansible_pipelining' from source: unknown 28173 1726882769.29304: variable 'ansible_timeout' from source: unknown 28173 1726882769.29309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.29408: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882769.29416: variable 'omit' from source: magic vars 28173 1726882769.29423: starting attempt loop 28173 1726882769.29426: running the handler 28173 1726882769.29521: variable '__network_connections_result' from source: set_fact 28173 1726882769.29572: handler run complete 28173 1726882769.29585: attempt loop complete, returning result 28173 1726882769.29588: _execute() done 28173 1726882769.29590: dumping result to json 28173 1726882769.29593: done dumping result, returning 28173 1726882769.29600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-926c-8928-000000000079] 28173 1726882769.29605: sending task result for task 0e448fcc-3ce9-926c-8928-000000000079 28173 1726882769.29686: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000079 28173 1726882769.29689: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } 28173 1726882769.29751: no more pending results, returning what we have 28173 1726882769.29754: results queue empty 28173 1726882769.29754: checking for any_errors_fatal 28173 1726882769.29758: done checking for any_errors_fatal 28173 1726882769.29759: checking for max_fail_percentage 28173 1726882769.29760: done checking for max_fail_percentage 28173 1726882769.29761: checking to see if all hosts have failed and the running result is not ok 28173 
1726882769.29762: done checking to see if all hosts have failed 28173 1726882769.29763: getting the remaining hosts for this loop 28173 1726882769.29765: done getting the remaining hosts for this loop 28173 1726882769.29768: getting the next task for host managed_node2 28173 1726882769.29773: done getting next task for host managed_node2 28173 1726882769.29777: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882769.29780: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882769.29789: getting variables 28173 1726882769.29790: in VariableManager get_vars() 28173 1726882769.29827: Calling all_inventory to load vars for managed_node2 28173 1726882769.29829: Calling groups_inventory to load vars for managed_node2 28173 1726882769.29830: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.29836: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.29838: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.29840: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.30599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.31531: done with get_vars() 28173 1726882769.31547: done getting variables 28173 1726882769.31589: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:29 -0400 (0:00:00.035) 0:00:22.480 ****** 28173 1726882769.31614: entering _queue_task() for managed_node2/debug 28173 1726882769.31791: worker is 1 (out of 1 available) 28173 1726882769.31804: exiting _queue_task() for managed_node2/debug 28173 1726882769.31817: done queuing things up, now waiting for results queue to drain 28173 1726882769.31819: waiting for pending results... 
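
The two debug tasks around this point ("Show stderr messages for the network_connections" and "Show debug messages for the network_connections") print parts of the registered __network_connections_result fact, as the task output above and below shows. A rough approximation of such tasks is sketched here; the task names, the task file path (roles/network/tasks/main.yml) and the variable name come from the log, while the exact task bodies are assumed.

# Approximation of the debug tasks seen in this run (bodies assumed):
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
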
28173 1726882769.31996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882769.32084: in run() - task 0e448fcc-3ce9-926c-8928-00000000007a 28173 1726882769.32092: variable 'ansible_search_path' from source: unknown 28173 1726882769.32095: variable 'ansible_search_path' from source: unknown 28173 1726882769.32122: calling self._execute() 28173 1726882769.32196: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.32199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.32208: variable 'omit' from source: magic vars 28173 1726882769.32458: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.32471: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.32479: variable 'omit' from source: magic vars 28173 1726882769.32519: variable 'omit' from source: magic vars 28173 1726882769.32544: variable 'omit' from source: magic vars 28173 1726882769.32578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882769.32603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882769.32619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882769.32634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.32643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.32667: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882769.32673: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.32676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.32744: Set connection var ansible_pipelining to False 28173 1726882769.32747: Set connection var ansible_shell_type to sh 28173 1726882769.32753: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882769.32759: Set connection var ansible_timeout to 10 28173 1726882769.32765: Set connection var ansible_shell_executable to /bin/sh 28173 1726882769.32773: Set connection var ansible_connection to ssh 28173 1726882769.32788: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.32790: variable 'ansible_connection' from source: unknown 28173 1726882769.32793: variable 'ansible_module_compression' from source: unknown 28173 1726882769.32795: variable 'ansible_shell_type' from source: unknown 28173 1726882769.32797: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.32799: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.32804: variable 'ansible_pipelining' from source: unknown 28173 1726882769.32807: variable 'ansible_timeout' from source: unknown 28173 1726882769.32809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.32907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882769.32915: variable 'omit' from source: magic vars 28173 1726882769.32920: starting attempt loop 28173 1726882769.32922: running the handler 28173 1726882769.32962: variable '__network_connections_result' from source: set_fact 28173 1726882769.33015: variable '__network_connections_result' from source: set_fact 28173 1726882769.33135: handler run complete 28173 1726882769.33159: attempt loop complete, returning result 28173 1726882769.33162: _execute() done 28173 1726882769.33166: dumping result to json 28173 1726882769.33173: done dumping result, returning 28173 1726882769.33183: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-926c-8928-00000000007a] 28173 1726882769.33188: sending task result for task 0e448fcc-3ce9-926c-8928-00000000007a 28173 1726882769.33281: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000007a 28173 1726882769.33284: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9ea3c671-64f7-45cb-8f46-d70830299b65 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } } 28173 1726882769.33386: no more pending results, returning what we have 28173 1726882769.33390: results queue empty 28173 1726882769.33391: checking for any_errors_fatal 28173 1726882769.33395: done checking for any_errors_fatal 28173 1726882769.33395: checking for max_fail_percentage 28173 1726882769.33397: done checking for max_fail_percentage 28173 1726882769.33402: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.33403: done checking to see if all hosts have failed 28173 1726882769.33403: getting the remaining hosts for this loop 28173 1726882769.33404: done getting the remaining hosts for this loop 28173 1726882769.33407: getting the next task for host managed_node2 28173 1726882769.33412: done getting next task for host managed_node2 28173 1726882769.33415: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882769.33417: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882769.33425: getting variables 28173 1726882769.33426: in VariableManager get_vars() 28173 1726882769.33450: Calling all_inventory to load vars for managed_node2 28173 1726882769.33451: Calling groups_inventory to load vars for managed_node2 28173 1726882769.33453: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.33459: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.33460: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.33462: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.34316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.35245: done with get_vars() 28173 1726882769.35259: done getting variables 28173 1726882769.35300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:29 -0400 (0:00:00.037) 0:00:22.517 ****** 28173 1726882769.35322: entering _queue_task() for managed_node2/debug 28173 1726882769.35497: worker is 1 (out of 1 available) 28173 1726882769.35509: exiting _queue_task() for managed_node2/debug 28173 1726882769.35521: done queuing things up, now waiting for results queue to drain 28173 1726882769.35522: waiting for pending results... 
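
Since the reapplied profile places its routes in the "custom" table, a natural follow-up outside the role is to confirm on the managed host that those routes actually landed there. The tasks below are an illustrative verification sketch, not part of this run; they assume the "custom" table name resolves on the target (e.g. an entry in /etc/iproute2/rt_tables), and the register name is hypothetical.

# Illustrative verification tasks (not part of this run).
- name: Check routes in the custom table
  command: ip route show table custom
  register: custom_table_routes
  changed_when: false

- name: Show the routes
  debug:
    var: custom_table_routes.stdout_lines
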
28173 1726882769.35690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882769.35774: in run() - task 0e448fcc-3ce9-926c-8928-00000000007b 28173 1726882769.35786: variable 'ansible_search_path' from source: unknown 28173 1726882769.35790: variable 'ansible_search_path' from source: unknown 28173 1726882769.35814: calling self._execute() 28173 1726882769.35883: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.35887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.35896: variable 'omit' from source: magic vars 28173 1726882769.36141: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.36150: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.36234: variable 'network_state' from source: role '' defaults 28173 1726882769.36243: Evaluated conditional (network_state != {}): False 28173 1726882769.36247: when evaluation is False, skipping this task 28173 1726882769.36249: _execute() done 28173 1726882769.36255: dumping result to json 28173 1726882769.36257: done dumping result, returning 28173 1726882769.36269: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-926c-8928-00000000007b] 28173 1726882769.36285: sending task result for task 0e448fcc-3ce9-926c-8928-00000000007b 28173 1726882769.36374: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000007b 28173 1726882769.36377: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28173 1726882769.36439: no more pending results, returning what we have 28173 1726882769.36443: results queue empty 28173 1726882769.36443: checking for any_errors_fatal 28173 1726882769.36451: done checking for any_errors_fatal 28173 1726882769.36452: checking for max_fail_percentage 28173 1726882769.36453: done checking for max_fail_percentage 28173 1726882769.36454: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.36455: done checking to see if all hosts have failed 28173 1726882769.36455: getting the remaining hosts for this loop 28173 1726882769.36456: done getting the remaining hosts for this loop 28173 1726882769.36459: getting the next task for host managed_node2 28173 1726882769.36465: done getting next task for host managed_node2 28173 1726882769.36469: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882769.36472: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882769.36486: getting variables 28173 1726882769.36487: in VariableManager get_vars() 28173 1726882769.36515: Calling all_inventory to load vars for managed_node2 28173 1726882769.36517: Calling groups_inventory to load vars for managed_node2 28173 1726882769.36518: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.36526: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.36529: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.36531: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.37288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.38225: done with get_vars() 28173 1726882769.38241: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:29 -0400 (0:00:00.029) 0:00:22.547 ****** 28173 1726882769.38311: entering _queue_task() for managed_node2/ping 28173 1726882769.38505: worker is 1 (out of 1 available) 28173 1726882769.38518: exiting _queue_task() for managed_node2/ping 28173 1726882769.38530: done queuing things up, now waiting for results queue to drain 28173 1726882769.38531: waiting for pending results... 28173 1726882769.38718: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882769.38802: in run() - task 0e448fcc-3ce9-926c-8928-00000000007c 28173 1726882769.38814: variable 'ansible_search_path' from source: unknown 28173 1726882769.38818: variable 'ansible_search_path' from source: unknown 28173 1726882769.38845: calling self._execute() 28173 1726882769.38918: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.38926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.38935: variable 'omit' from source: magic vars 28173 1726882769.39205: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.39215: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.39221: variable 'omit' from source: magic vars 28173 1726882769.39269: variable 'omit' from source: magic vars 28173 1726882769.39293: variable 'omit' from source: magic vars 28173 1726882769.39324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882769.39349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882769.39369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882769.39381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.39393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.39417: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882769.39420: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.39423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.39493: Set connection var ansible_pipelining to False 28173 1726882769.39496: Set connection var 
ansible_shell_type to sh 28173 1726882769.39503: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882769.39510: Set connection var ansible_timeout to 10 28173 1726882769.39516: Set connection var ansible_shell_executable to /bin/sh 28173 1726882769.39521: Set connection var ansible_connection to ssh 28173 1726882769.39536: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.39539: variable 'ansible_connection' from source: unknown 28173 1726882769.39542: variable 'ansible_module_compression' from source: unknown 28173 1726882769.39544: variable 'ansible_shell_type' from source: unknown 28173 1726882769.39546: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.39549: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.39552: variable 'ansible_pipelining' from source: unknown 28173 1726882769.39555: variable 'ansible_timeout' from source: unknown 28173 1726882769.39559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.39706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882769.39714: variable 'omit' from source: magic vars 28173 1726882769.39718: starting attempt loop 28173 1726882769.39721: running the handler 28173 1726882769.39732: _low_level_execute_command(): starting 28173 1726882769.39739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882769.40255: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.40281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.40293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28173 1726882769.40306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.40353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.40369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.40488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.42154: stdout chunk (state=3): >>>/root <<< 28173 1726882769.42335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.42345: stdout chunk (state=3): >>><<< 28173 1726882769.42357: stderr chunk (state=3): >>><<< 28173 1726882769.42383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.42401: _low_level_execute_command(): starting 28173 1726882769.42411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032 `" && echo ansible-tmp-1726882769.4238942-29152-190577620237032="` echo /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032 `" ) && sleep 0' 28173 1726882769.43030: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.43045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.43070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.43099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.43180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.43191: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.43217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.43240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882769.43253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882769.43282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882769.43299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.43319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882769.43322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.43361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.43380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.43396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.43517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
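
The ping ("Re-test connectivity") execution around this point repeats the full module transfer cycle visible throughout the log: create a remote tmp dir, put AnsiballZ_ping.py over SFTP, chmod it, run it with the remote python, then remove the tmp dir. That is expected here because the connection variable ansible_pipelining is set to False for this host. The snippet below is an illustrative inventory/group_vars setting, not something this run uses: with pipelining enabled, most Python modules are piped to the remote interpreter over the existing SSH session instead of being written to a temp file first (this also assumes the target's sudoers does not enforce requiretty when become is used).

# group_vars/all.yml (illustrative; not used in this run)
ansible_pipelining: true
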
28173 1726882769.45749: stdout chunk (state=3): >>>ansible-tmp-1726882769.4238942-29152-190577620237032=/root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032 <<< 28173 1726882769.45766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.45831: stderr chunk (state=3): >>><<< 28173 1726882769.45842: stdout chunk (state=3): >>><<< 28173 1726882769.46131: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882769.4238942-29152-190577620237032=/root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.46135: variable 'ansible_module_compression' from source: unknown 28173 1726882769.46137: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28173 1726882769.46139: variable 'ansible_facts' from source: unknown 28173 1726882769.46141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/AnsiballZ_ping.py 28173 1726882769.46199: Sending initial data 28173 1726882769.46202: Sent initial data (153 bytes) 28173 1726882769.47096: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.47109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.47122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.47138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.47180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.47191: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.47204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.47220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882769.47231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882769.47242: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882769.47255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.47270: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28173 1726882769.47286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.47298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.47309: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882769.47321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.47407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.47430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.47449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.47580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.49386: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882769.49486: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882769.49589: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpluod1wuj /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/AnsiballZ_ping.py <<< 28173 1726882769.49690: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882769.51025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.51206: stderr chunk (state=3): >>><<< 28173 1726882769.51209: stdout chunk (state=3): >>><<< 28173 1726882769.51211: done transferring module to remote 28173 1726882769.51213: _low_level_execute_command(): starting 28173 1726882769.51215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/ /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/AnsiballZ_ping.py && sleep 0' 28173 1726882769.51817: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.51831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.51846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.51882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.51931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.51954: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.51982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.52005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 
1726882769.52019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882769.52034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882769.52052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.52069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.52102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.52121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.52132: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882769.52151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.52241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.52271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.52295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.52435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.54213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.54282: stderr chunk (state=3): >>><<< 28173 1726882769.54285: stdout chunk (state=3): >>><<< 28173 1726882769.54371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.54374: _low_level_execute_command(): starting 28173 1726882769.54377: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/AnsiballZ_ping.py && sleep 0' 28173 1726882769.54909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.54922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.54936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.54952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.54994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
<<< 28173 1726882769.55006: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.55018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.55033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882769.55044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882769.55053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882769.55065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.55080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.55096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.55107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.55116: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882769.55127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.55204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.55224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.55237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.55371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.68246: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28173 1726882769.69357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882769.69372: stdout chunk (state=3): >>><<< 28173 1726882769.69375: stderr chunk (state=3): >>><<< 28173 1726882769.69521: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
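The exchange above is the role's connectivity re-test: Ansible stages AnsiballZ_ping.py in a fresh remote temp directory over the multiplexed SSH connection, marks it executable, runs it with the target's /usr/bin/python3.9, and reads back {"ping": "pong"}. As a rough illustration only, a task of the following shape produces the same module invocation; the real task lives inside the fedora.linux_system_roles.network role and its exact YAML may differ.

```yaml
# Illustrative sketch, not the role's actual source. The ping module's default
# payload is "pong", which matches the module_args recorded in the log above.
- name: Re-test connectivity
  ansible.builtin.ping:
```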
28173 1726882769.69526: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882769.69528: _low_level_execute_command(): starting 28173 1726882769.69530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882769.4238942-29152-190577620237032/ > /dev/null 2>&1 && sleep 0' 28173 1726882769.70228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.70242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.70256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.70278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.70323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.70377: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.70398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.70447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882769.70473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882769.70495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882769.70517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.70571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.70590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.70603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.70630: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882769.70655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.70752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.70786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.70805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.70935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.72761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.72870: stderr chunk (state=3): >>><<< 28173 1726882769.72884: stdout chunk (state=3): >>><<< 28173 1726882769.73010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.73012: handler run complete 28173 1726882769.73014: attempt loop complete, returning result 28173 1726882769.73016: _execute() done 28173 1726882769.73018: dumping result to json 28173 1726882769.73020: done dumping result, returning 28173 1726882769.73021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-926c-8928-00000000007c] 28173 1726882769.73023: sending task result for task 0e448fcc-3ce9-926c-8928-00000000007c 28173 1726882769.73128: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000007c ok: [managed_node2] => { "changed": false, "ping": "pong" } 28173 1726882769.73200: no more pending results, returning what we have 28173 1726882769.73204: results queue empty 28173 1726882769.73205: checking for any_errors_fatal 28173 1726882769.73211: done checking for any_errors_fatal 28173 1726882769.73212: checking for max_fail_percentage 28173 1726882769.73213: done checking for max_fail_percentage 28173 1726882769.73214: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.73215: done checking to see if all hosts have failed 28173 1726882769.73216: getting the remaining hosts for this loop 28173 1726882769.73218: done getting the remaining hosts for this loop 28173 1726882769.73221: getting the next task for host managed_node2 28173 1726882769.73231: done getting next task for host managed_node2 28173 1726882769.73234: ^ task is: TASK: meta (role_complete) 28173 1726882769.73239: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882769.73252: getting variables 28173 1726882769.73254: in VariableManager get_vars() 28173 1726882769.73302: Calling all_inventory to load vars for managed_node2 28173 1726882769.73305: Calling groups_inventory to load vars for managed_node2 28173 1726882769.73308: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.73320: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.73323: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.73327: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.73980: WORKER PROCESS EXITING 28173 1726882769.79461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.81894: done with get_vars() 28173 1726882769.81926: done getting variables 28173 1726882769.82028: done queuing things up, now waiting for results queue to drain 28173 1726882769.82030: results queue empty 28173 1726882769.82031: checking for any_errors_fatal 28173 1726882769.82033: done checking for any_errors_fatal 28173 1726882769.82033: checking for max_fail_percentage 28173 1726882769.82034: done checking for max_fail_percentage 28173 1726882769.82034: checking to see if all hosts have failed and the running result is not ok 28173 1726882769.82035: done checking to see if all hosts have failed 28173 1726882769.82035: getting the remaining hosts for this loop 28173 1726882769.82036: done getting the remaining hosts for this loop 28173 1726882769.82038: getting the next task for host managed_node2 28173 1726882769.82040: done getting next task for host managed_node2 28173 1726882769.82042: ^ task is: TASK: Get the routes from the named route table 'custom' 28173 1726882769.82043: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882769.82044: getting variables 28173 1726882769.82045: in VariableManager get_vars() 28173 1726882769.82055: Calling all_inventory to load vars for managed_node2 28173 1726882769.82056: Calling groups_inventory to load vars for managed_node2 28173 1726882769.82060: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882769.82068: Calling all_plugins_play to load vars for managed_node2 28173 1726882769.82073: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882769.82076: Calling groups_plugins_play to load vars for managed_node2 28173 1726882769.82760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882769.84251: done with get_vars() 28173 1726882769.84276: done getting variables 28173 1726882769.84331: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the named route table 'custom'] ********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:121 Friday 20 September 2024 21:39:29 -0400 (0:00:00.460) 0:00:23.007 ****** 28173 1726882769.84355: entering _queue_task() for managed_node2/command 28173 1726882769.84758: worker is 1 (out of 1 available) 28173 1726882769.84774: exiting _queue_task() for managed_node2/command 28173 1726882769.84787: done queuing things up, now waiting for results queue to drain 28173 1726882769.84788: waiting for pending results... 28173 1726882769.85708: running TaskExecutor() for managed_node2/TASK: Get the routes from the named route table 'custom' 28173 1726882769.85810: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ac 28173 1726882769.85973: variable 'ansible_search_path' from source: unknown 28173 1726882769.85977: calling self._execute() 28173 1726882769.85999: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.86003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.86012: variable 'omit' from source: magic vars 28173 1726882769.86370: variable 'ansible_distribution_major_version' from source: facts 28173 1726882769.86389: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882769.86399: variable 'omit' from source: magic vars 28173 1726882769.86422: variable 'omit' from source: magic vars 28173 1726882769.86459: variable 'omit' from source: magic vars 28173 1726882769.86506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882769.86542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882769.86569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882769.86597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.86615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882769.86648: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882769.86657: variable 
'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.86666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.86773: Set connection var ansible_pipelining to False 28173 1726882769.86783: Set connection var ansible_shell_type to sh 28173 1726882769.86797: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882769.86810: Set connection var ansible_timeout to 10 28173 1726882769.86819: Set connection var ansible_shell_executable to /bin/sh 28173 1726882769.86827: Set connection var ansible_connection to ssh 28173 1726882769.86851: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.86858: variable 'ansible_connection' from source: unknown 28173 1726882769.86866: variable 'ansible_module_compression' from source: unknown 28173 1726882769.86873: variable 'ansible_shell_type' from source: unknown 28173 1726882769.86879: variable 'ansible_shell_executable' from source: unknown 28173 1726882769.86884: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882769.86891: variable 'ansible_pipelining' from source: unknown 28173 1726882769.86897: variable 'ansible_timeout' from source: unknown 28173 1726882769.86904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882769.87037: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882769.87051: variable 'omit' from source: magic vars 28173 1726882769.87059: starting attempt loop 28173 1726882769.87068: running the handler 28173 1726882769.87086: _low_level_execute_command(): starting 28173 1726882769.87098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882769.87764: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882769.87791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.87795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.87816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.87849: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882769.87851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.87854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882769.87856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882769.87858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.87916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.87919: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.88035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.89689: stdout chunk (state=3): >>>/root <<< 28173 1726882769.89793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.89871: stderr chunk (state=3): >>><<< 28173 1726882769.89889: stdout chunk (state=3): >>><<< 28173 1726882769.90006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.90010: _low_level_execute_command(): starting 28173 1726882769.90013: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493 `" && echo ansible-tmp-1726882769.8991575-29175-233451946080493="` echo /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493 `" ) && sleep 0' 28173 1726882769.90665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.90673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.90704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.90715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882769.90717: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.90776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882769.90800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 
1726882769.90935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.92785: stdout chunk (state=3): >>>ansible-tmp-1726882769.8991575-29175-233451946080493=/root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493 <<< 28173 1726882769.92901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.92946: stderr chunk (state=3): >>><<< 28173 1726882769.92948: stdout chunk (state=3): >>><<< 28173 1726882769.92969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882769.8991575-29175-233451946080493=/root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.92992: variable 'ansible_module_compression' from source: unknown 28173 1726882769.93028: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882769.93056: variable 'ansible_facts' from source: unknown 28173 1726882769.93115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/AnsiballZ_command.py 28173 1726882769.93217: Sending initial data 28173 1726882769.93226: Sent initial data (156 bytes) 28173 1726882769.93850: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882769.93856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.93906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.93909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.93912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.93960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.93965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.94073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.95810: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28173 1726882769.95814: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882769.95904: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882769.95999: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6ry8j4yd /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/AnsiballZ_command.py <<< 28173 1726882769.96093: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882769.97123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.97229: stderr chunk (state=3): >>><<< 28173 1726882769.97232: stdout chunk (state=3): >>><<< 28173 1726882769.97254: done transferring module to remote 28173 1726882769.97259: _low_level_execute_command(): starting 28173 1726882769.97305: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/ /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/AnsiballZ_command.py && sleep 0' 28173 1726882769.97787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882769.97826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882769.97830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882769.97888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882769.97891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882769.97990: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882769.99731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882769.99778: stderr chunk (state=3): >>><<< 28173 1726882769.99783: stdout chunk (state=3): >>><<< 28173 1726882769.99794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882769.99798: _low_level_execute_command(): starting 28173 1726882769.99803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/AnsiballZ_command.py && sleep 0' 28173 1726882770.00211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.00215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.00246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.00250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.00252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.00306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.00310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.00417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.13906: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 
198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-20 21:39:30.133518", "end": "2024-09-20 21:39:30.137040", "delta": "0:00:00.003522", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882770.15202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882770.15206: stdout chunk (state=3): >>><<< 28173 1726882770.15211: stderr chunk (state=3): >>><<< 28173 1726882770.15232: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-20 21:39:30.133518", "end": "2024-09-20 21:39:30.137040", "delta": "0:00:00.003522", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
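The module result above comes from the task at tests_route_table.yml:121. Reconstructed from the logged module_args (`ip route show table custom`) and the variable name the later assert reads (route_table_custom), the task is roughly of the following shape; this is an inference from the log, not the playbook's literal YAML.

```yaml
# Sketch inferred from the logged invocation; the test playbook may differ.
- name: Get the routes from the named route table 'custom'
  ansible.builtin.command: ip route show table custom
  register: route_table_custom
  changed_when: false  # consistent with the "Evaluated conditional (False): False" line below
```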
28173 1726882770.15273: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882770.15280: _low_level_execute_command(): starting 28173 1726882770.15286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882769.8991575-29175-233451946080493/ > /dev/null 2>&1 && sleep 0' 28173 1726882770.15911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.15919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.15929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.15944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.15981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.15988: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.15999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.16013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.16020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.16026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.16034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.16043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.16054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.16061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.16071: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.16080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.16151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.16172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.16181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.16306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.18192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882770.18195: stdout chunk (state=3): >>><<< 28173 1726882770.18197: stderr chunk (state=3): >>><<< 28173 1726882770.18374: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882770.18377: handler run complete 28173 1726882770.18380: Evaluated conditional (False): False 28173 1726882770.18382: attempt loop complete, returning result 28173 1726882770.18384: _execute() done 28173 1726882770.18386: dumping result to json 28173 1726882770.18388: done dumping result, returning 28173 1726882770.18390: done running TaskExecutor() for managed_node2/TASK: Get the routes from the named route table 'custom' [0e448fcc-3ce9-926c-8928-0000000000ac] 28173 1726882770.18393: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ac 28173 1726882770.18477: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ac 28173 1726882770.18480: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "custom" ], "delta": "0:00:00.003522", "end": "2024-09-20 21:39:30.137040", "rc": 0, "start": "2024-09-20 21:39:30.133518" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 28173 1726882770.18568: no more pending results, returning what we have 28173 1726882770.18572: results queue empty 28173 1726882770.18573: checking for any_errors_fatal 28173 1726882770.18575: done checking for any_errors_fatal 28173 1726882770.18576: checking for max_fail_percentage 28173 1726882770.18578: done checking for max_fail_percentage 28173 1726882770.18579: checking to see if all hosts have failed and the running result is not ok 28173 1726882770.18580: done checking to see if all hosts have failed 28173 1726882770.18581: getting the remaining hosts for this loop 28173 1726882770.18583: done getting the remaining hosts for this loop 28173 1726882770.18587: getting the next task for host managed_node2 28173 1726882770.18595: done getting next task for host managed_node2 28173 1726882770.18598: ^ task is: TASK: Assert that the named route table 'custom' contains the specified route 28173 1726882770.18601: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882770.18605: getting variables 28173 1726882770.18607: in VariableManager get_vars() 28173 1726882770.18658: Calling all_inventory to load vars for managed_node2 28173 1726882770.18661: Calling groups_inventory to load vars for managed_node2 28173 1726882770.18665: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.18679: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.18682: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.18685: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.20487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882770.22327: done with get_vars() 28173 1726882770.22350: done getting variables 28173 1726882770.22419: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the named route table 'custom' contains the specified route] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:127 Friday 20 September 2024 21:39:30 -0400 (0:00:00.380) 0:00:23.388 ****** 28173 1726882770.22449: entering _queue_task() for managed_node2/assert 28173 1726882770.22771: worker is 1 (out of 1 available) 28173 1726882770.22782: exiting _queue_task() for managed_node2/assert 28173 1726882770.22803: done queuing things up, now waiting for results queue to drain 28173 1726882770.22805: waiting for pending results... 28173 1726882770.23106: running TaskExecutor() for managed_node2/TASK: Assert that the named route table 'custom' contains the specified route 28173 1726882770.23216: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ad 28173 1726882770.23242: variable 'ansible_search_path' from source: unknown 28173 1726882770.23291: calling self._execute() 28173 1726882770.23407: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.23419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.23437: variable 'omit' from source: magic vars 28173 1726882770.23858: variable 'ansible_distribution_major_version' from source: facts 28173 1726882770.23882: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882770.23894: variable 'omit' from source: magic vars 28173 1726882770.24276: variable 'omit' from source: magic vars 28173 1726882770.24311: variable 'omit' from source: magic vars 28173 1726882770.24347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882770.24383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882770.24401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882770.24418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882770.24430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882770.24457: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882770.24460: 
variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.24463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.24561: Set connection var ansible_pipelining to False 28173 1726882770.24567: Set connection var ansible_shell_type to sh 28173 1726882770.24576: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882770.24584: Set connection var ansible_timeout to 10 28173 1726882770.24590: Set connection var ansible_shell_executable to /bin/sh 28173 1726882770.24595: Set connection var ansible_connection to ssh 28173 1726882770.24617: variable 'ansible_shell_executable' from source: unknown 28173 1726882770.24620: variable 'ansible_connection' from source: unknown 28173 1726882770.24629: variable 'ansible_module_compression' from source: unknown 28173 1726882770.24632: variable 'ansible_shell_type' from source: unknown 28173 1726882770.24635: variable 'ansible_shell_executable' from source: unknown 28173 1726882770.24637: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.24639: variable 'ansible_pipelining' from source: unknown 28173 1726882770.24641: variable 'ansible_timeout' from source: unknown 28173 1726882770.24643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.24779: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882770.24788: variable 'omit' from source: magic vars 28173 1726882770.24794: starting attempt loop 28173 1726882770.24797: running the handler 28173 1726882770.24951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882770.25184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882770.25223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882770.25285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882770.25316: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882770.25397: variable 'route_table_custom' from source: set_fact 28173 1726882770.25425: Evaluated conditional (route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 28173 1726882770.25560: variable 'route_table_custom' from source: set_fact 28173 1726882770.25589: Evaluated conditional (route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 28173 1726882770.25716: variable 'route_table_custom' from source: set_fact 28173 1726882770.25742: Evaluated conditional (route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 28173 1726882770.25747: handler run complete 28173 1726882770.25761: attempt loop complete, returning result 28173 1726882770.25765: _execute() done 28173 1726882770.25768: dumping result to json 28173 1726882770.25771: done dumping result, returning 28173 1726882770.25780: done running TaskExecutor() for managed_node2/TASK: Assert that the named route table 'custom' 
contains the specified route [0e448fcc-3ce9-926c-8928-0000000000ad] 28173 1726882770.25786: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ad 28173 1726882770.25876: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ad 28173 1726882770.25879: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882770.25954: no more pending results, returning what we have 28173 1726882770.25958: results queue empty 28173 1726882770.25958: checking for any_errors_fatal 28173 1726882770.25967: done checking for any_errors_fatal 28173 1726882770.25968: checking for max_fail_percentage 28173 1726882770.25970: done checking for max_fail_percentage 28173 1726882770.25971: checking to see if all hosts have failed and the running result is not ok 28173 1726882770.25972: done checking to see if all hosts have failed 28173 1726882770.25972: getting the remaining hosts for this loop 28173 1726882770.25974: done getting the remaining hosts for this loop 28173 1726882770.25978: getting the next task for host managed_node2 28173 1726882770.25982: done getting next task for host managed_node2 28173 1726882770.25985: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 28173 1726882770.25988: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882770.25991: getting variables 28173 1726882770.25992: in VariableManager get_vars() 28173 1726882770.26027: Calling all_inventory to load vars for managed_node2 28173 1726882770.26029: Calling groups_inventory to load vars for managed_node2 28173 1726882770.26031: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.26040: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.26043: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.26046: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.27619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882770.29331: done with get_vars() 28173 1726882770.29356: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135 Friday 20 September 2024 21:39:30 -0400 (0:00:00.070) 0:00:23.459 ****** 28173 1726882770.29463: entering _queue_task() for managed_node2/file 28173 1726882770.29784: worker is 1 (out of 1 available) 28173 1726882770.29798: exiting _queue_task() for managed_node2/file 28173 1726882770.29811: done queuing things up, now waiting for results queue to drain 28173 1726882770.29813: waiting for pending results... 
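For reference, the assert that just ran can be reconstructed from the three "Evaluated conditional" lines in the trace above; a minimal sketch of the task at tests_route_table.yml:127 (the exact wording in the test file may differ) looks like this:

- name: Assert that the named route table 'custom' contains the specified route
  assert:
    that:
      # each pattern is quoted verbatim from the "Evaluated conditional" lines in the trace
      - route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")
      - route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")
      - route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")

All three conditionals evaluated True against the route_table_custom fact set earlier, which is why the handler returns "All assertions passed" with changed: false and no remote command is executed for this task.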
28173 1726882770.30116: running TaskExecutor() for managed_node2/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 28173 1726882770.30226: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ae 28173 1726882770.30250: variable 'ansible_search_path' from source: unknown 28173 1726882770.30300: calling self._execute() 28173 1726882770.30414: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.30425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.30441: variable 'omit' from source: magic vars 28173 1726882770.30862: variable 'ansible_distribution_major_version' from source: facts 28173 1726882770.30881: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882770.30891: variable 'omit' from source: magic vars 28173 1726882770.30916: variable 'omit' from source: magic vars 28173 1726882770.30956: variable 'omit' from source: magic vars 28173 1726882770.30999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882770.31038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882770.31066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882770.31087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882770.31102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882770.31135: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882770.31143: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.31149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.31253: Set connection var ansible_pipelining to False 28173 1726882770.31263: Set connection var ansible_shell_type to sh 28173 1726882770.31281: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882770.31292: Set connection var ansible_timeout to 10 28173 1726882770.31300: Set connection var ansible_shell_executable to /bin/sh 28173 1726882770.31308: Set connection var ansible_connection to ssh 28173 1726882770.31331: variable 'ansible_shell_executable' from source: unknown 28173 1726882770.31341: variable 'ansible_connection' from source: unknown 28173 1726882770.31347: variable 'ansible_module_compression' from source: unknown 28173 1726882770.31353: variable 'ansible_shell_type' from source: unknown 28173 1726882770.31359: variable 'ansible_shell_executable' from source: unknown 28173 1726882770.31366: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882770.31376: variable 'ansible_pipelining' from source: unknown 28173 1726882770.31384: variable 'ansible_timeout' from source: unknown 28173 1726882770.31391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882770.31596: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882770.31614: variable 'omit' from source: magic vars 28173 1726882770.31623: starting attempt loop 28173 1726882770.31628: running the handler 
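Based on the module arguments echoed further down in this trace (state=absent, path=/etc/iproute2/rt_tables.d/table.conf), the task being executed here is roughly the following; this is a sketch inferred from the trace, not the literal contents of tests_route_table.yml:135:

- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  file:
    path: /etc/iproute2/rt_tables.d/table.conf
    state: absent

Because file is a regular module dispatched through the 'normal' action plugin loaded above, the executor now builds an AnsiballZ payload, copies it to the target over the existing SSH connection, and runs it with the remote Python interpreter, which is what the _low_level_execute_command() calls below are doing.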
28173 1726882770.31641: _low_level_execute_command(): starting 28173 1726882770.31650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882770.32433: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.32449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.32468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.32493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.32539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.32554: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.32572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.32595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.32609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.32621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.32634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.32651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.32672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.32687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.32701: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.32719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.32799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.32827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.32845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.32995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.34629: stdout chunk (state=3): >>>/root <<< 28173 1726882770.34731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882770.34811: stderr chunk (state=3): >>><<< 28173 1726882770.34822: stdout chunk (state=3): >>><<< 28173 1726882770.34870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882770.34874: _low_level_execute_command(): starting 28173 1726882770.34948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894 `" && echo ansible-tmp-1726882770.3485117-29196-111523595458894="` echo /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894 `" ) && sleep 0' 28173 1726882770.35529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.35546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.35560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.35580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.35620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.35631: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.35650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.35668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.35681: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.35690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.35702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.35714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.35727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.35738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.35747: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.35767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.35843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.35873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.35892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.36022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.37891: stdout chunk (state=3): >>>ansible-tmp-1726882770.3485117-29196-111523595458894=/root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894 <<< 28173 1726882770.38083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882770.38087: stdout chunk (state=3): >>><<< 28173 1726882770.38089: stderr chunk (state=3): >>><<< 28173 1726882770.38170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882770.3485117-29196-111523595458894=/root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894 , stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882770.38174: variable 'ansible_module_compression' from source: unknown 28173 1726882770.38483: ANSIBALLZ: Using lock for file 28173 1726882770.38486: ANSIBALLZ: Acquiring lock 28173 1726882770.38488: ANSIBALLZ: Lock acquired: 140243976270816 28173 1726882770.38490: ANSIBALLZ: Creating module 28173 1726882770.55595: ANSIBALLZ: Writing module into payload 28173 1726882770.55808: ANSIBALLZ: Writing module 28173 1726882770.55834: ANSIBALLZ: Renaming module 28173 1726882770.55842: ANSIBALLZ: Done creating module 28173 1726882770.55865: variable 'ansible_facts' from source: unknown 28173 1726882770.55960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/AnsiballZ_file.py 28173 1726882770.56110: Sending initial data 28173 1726882770.56113: Sent initial data (153 bytes) 28173 1726882770.57128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.57138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.57156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.57176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.57214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.57221: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.57231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.57245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.57256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.57264: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.57277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.57290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.57302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.57309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 
1726882770.57316: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.57326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.57404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.57421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.57434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.57573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.59401: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882770.59497: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882770.59600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6z53aunw /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/AnsiballZ_file.py <<< 28173 1726882770.59695: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882770.61392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882770.61770: stderr chunk (state=3): >>><<< 28173 1726882770.61773: stdout chunk (state=3): >>><<< 28173 1726882770.61776: done transferring module to remote 28173 1726882770.61778: _low_level_execute_command(): starting 28173 1726882770.61781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/ /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/AnsiballZ_file.py && sleep 0' 28173 1726882770.63484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.63490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.63676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.63682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.63699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882770.63705: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.63899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.63917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.64045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.65917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882770.65921: stderr chunk (state=3): >>><<< 28173 1726882770.65924: stdout chunk (state=3): >>><<< 28173 1726882770.65945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882770.65949: _low_level_execute_command(): starting 28173 1726882770.65953: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/AnsiballZ_file.py && sleep 0' 28173 1726882770.66922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.66936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.66950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.66973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.67020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.67034: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.67049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.67069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.67084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.67095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.67107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.67121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.67141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.67152: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.67162: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.67177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.67254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.67280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.67297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.67434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.81086: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 28173 1726882770.82186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882770.82190: stdout chunk (state=3): >>><<< 28173 1726882770.82193: stderr chunk (state=3): >>><<< 28173 1726882770.82270: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882770.82280: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882770.82283: _low_level_execute_command(): starting 28173 1726882770.82286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882770.3485117-29196-111523595458894/ > /dev/null 2>&1 && sleep 0' 28173 1726882770.82914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882770.82928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.82945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.82957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.82997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.83004: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882770.83014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.83026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882770.83033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882770.83040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882770.83048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882770.83057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882770.83109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882770.83118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882770.83125: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882770.83135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882770.83241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882770.83258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882770.83315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882770.83441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882770.85335: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 28173 1726882770.85341: stdout chunk (state=3): >>><<< 28173 1726882770.85370: stderr chunk (state=3): >>><<< 28173 1726882770.85373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882770.85376: handler run complete 28173 1726882770.85399: attempt loop complete, returning result 28173 1726882770.85402: _execute() done 28173 1726882770.85404: dumping result to json 28173 1726882770.85409: done dumping result, returning 28173 1726882770.85418: done running TaskExecutor() for managed_node2/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [0e448fcc-3ce9-926c-8928-0000000000ae] 28173 1726882770.85424: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ae 28173 1726882770.85533: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ae 28173 1726882770.85536: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 28173 1726882770.85603: no more pending results, returning what we have 28173 1726882770.85605: results queue empty 28173 1726882770.85606: checking for any_errors_fatal 28173 1726882770.85613: done checking for any_errors_fatal 28173 1726882770.85614: checking for max_fail_percentage 28173 1726882770.85615: done checking for max_fail_percentage 28173 1726882770.85616: checking to see if all hosts have failed and the running result is not ok 28173 1726882770.85617: done checking to see if all hosts have failed 28173 1726882770.85618: getting the remaining hosts for this loop 28173 1726882770.85619: done getting the remaining hosts for this loop 28173 1726882770.85623: getting the next task for host managed_node2 28173 1726882770.85630: done getting next task for host managed_node2 28173 1726882770.85632: ^ task is: TASK: meta (flush_handlers) 28173 1726882770.85634: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882770.85638: getting variables 28173 1726882770.85639: in VariableManager get_vars() 28173 1726882770.85692: Calling all_inventory to load vars for managed_node2 28173 1726882770.85695: Calling groups_inventory to load vars for managed_node2 28173 1726882770.85697: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.85707: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.85709: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.85712: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.87640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882770.89763: done with get_vars() 28173 1726882770.89792: done getting variables 28173 1726882770.89871: in VariableManager get_vars() 28173 1726882770.89888: Calling all_inventory to load vars for managed_node2 28173 1726882770.89890: Calling groups_inventory to load vars for managed_node2 28173 1726882770.89893: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.89898: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.89900: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.89903: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.91959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882770.94737: done with get_vars() 28173 1726882770.94774: done queuing things up, now waiting for results queue to drain 28173 1726882770.94776: results queue empty 28173 1726882770.94777: checking for any_errors_fatal 28173 1726882770.94781: done checking for any_errors_fatal 28173 1726882770.94782: checking for max_fail_percentage 28173 1726882770.94783: done checking for max_fail_percentage 28173 1726882770.94783: checking to see if all hosts have failed and the running result is not ok 28173 1726882770.94784: done checking to see if all hosts have failed 28173 1726882770.94785: getting the remaining hosts for this loop 28173 1726882770.94786: done getting the remaining hosts for this loop 28173 1726882770.94789: getting the next task for host managed_node2 28173 1726882770.94793: done getting next task for host managed_node2 28173 1726882770.94794: ^ task is: TASK: meta (flush_handlers) 28173 1726882770.94796: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882770.94798: getting variables 28173 1726882770.94799: in VariableManager get_vars() 28173 1726882770.94818: Calling all_inventory to load vars for managed_node2 28173 1726882770.94820: Calling groups_inventory to load vars for managed_node2 28173 1726882770.94822: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.94828: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.94830: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.94833: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.96210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882770.98045: done with get_vars() 28173 1726882770.98073: done getting variables 28173 1726882770.98124: in VariableManager get_vars() 28173 1726882770.98139: Calling all_inventory to load vars for managed_node2 28173 1726882770.98142: Calling groups_inventory to load vars for managed_node2 28173 1726882770.98144: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882770.98149: Calling all_plugins_play to load vars for managed_node2 28173 1726882770.98151: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882770.98154: Calling groups_plugins_play to load vars for managed_node2 28173 1726882770.99453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.01293: done with get_vars() 28173 1726882771.01318: done queuing things up, now waiting for results queue to drain 28173 1726882771.01320: results queue empty 28173 1726882771.01321: checking for any_errors_fatal 28173 1726882771.01322: done checking for any_errors_fatal 28173 1726882771.01323: checking for max_fail_percentage 28173 1726882771.01324: done checking for max_fail_percentage 28173 1726882771.01325: checking to see if all hosts have failed and the running result is not ok 28173 1726882771.01325: done checking to see if all hosts have failed 28173 1726882771.01330: getting the remaining hosts for this loop 28173 1726882771.01331: done getting the remaining hosts for this loop 28173 1726882771.01334: getting the next task for host managed_node2 28173 1726882771.01337: done getting next task for host managed_node2 28173 1726882771.01338: ^ task is: None 28173 1726882771.01339: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882771.01340: done queuing things up, now waiting for results queue to drain 28173 1726882771.01341: results queue empty 28173 1726882771.01342: checking for any_errors_fatal 28173 1726882771.01342: done checking for any_errors_fatal 28173 1726882771.01343: checking for max_fail_percentage 28173 1726882771.01344: done checking for max_fail_percentage 28173 1726882771.01344: checking to see if all hosts have failed and the running result is not ok 28173 1726882771.01345: done checking to see if all hosts have failed 28173 1726882771.01347: getting the next task for host managed_node2 28173 1726882771.01349: done getting next task for host managed_node2 28173 1726882771.01350: ^ task is: None 28173 1726882771.01351: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882771.01411: in VariableManager get_vars() 28173 1726882771.01430: done with get_vars() 28173 1726882771.01438: in VariableManager get_vars() 28173 1726882771.01450: done with get_vars() 28173 1726882771.01454: variable 'omit' from source: magic vars 28173 1726882771.01576: variable 'profile' from source: play vars 28173 1726882771.01698: in VariableManager get_vars() 28173 1726882771.01712: done with get_vars() 28173 1726882771.01731: variable 'omit' from source: magic vars 28173 1726882771.01803: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 28173 1726882771.02549: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28173 1726882771.02576: getting the remaining hosts for this loop 28173 1726882771.02577: done getting the remaining hosts for this loop 28173 1726882771.02580: getting the next task for host managed_node2 28173 1726882771.02582: done getting next task for host managed_node2 28173 1726882771.02584: ^ task is: TASK: Gathering Facts 28173 1726882771.02586: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882771.02588: getting variables 28173 1726882771.02589: in VariableManager get_vars() 28173 1726882771.02659: Calling all_inventory to load vars for managed_node2 28173 1726882771.02662: Calling groups_inventory to load vars for managed_node2 28173 1726882771.02668: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882771.02674: Calling all_plugins_play to load vars for managed_node2 28173 1726882771.02677: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882771.02680: Calling groups_plugins_play to load vars for managed_node2 28173 1726882771.04218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.06134: done with get_vars() 28173 1726882771.06155: done getting variables 28173 1726882771.06201: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:39:31 -0400 (0:00:00.767) 0:00:24.226 ****** 28173 1726882771.06225: entering _queue_task() for managed_node2/gather_facts 28173 1726882771.06529: worker is 1 (out of 1 available) 28173 1726882771.06540: exiting _queue_task() for managed_node2/gather_facts 28173 1726882771.06549: done queuing things up, now waiting for results queue to drain 28173 1726882771.06551: waiting for pending results... 
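Before each task in this trace the executor applies the same connection settings (the "Set connection var" lines repeat again below for this fact-gathering task). Expressed as host variables purely for illustration (the trace does not show where these values were actually configured), they amount to:

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false          # pipelining off, hence the explicit tmpdir + sftp transfer for every module
ansible_timeout: 10
ansible_module_compression: ZIP_DEFLATED

With pipelining disabled, every module run, including the setup module used for fact gathering here, goes through the same mkdir/put/chmod/execute/rm cycle seen for the file task above, ending with the ansible_facts JSON that the remote AnsiballZ_setup.py prints on stdout.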
28173 1726882771.06826: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882771.06934: in run() - task 0e448fcc-3ce9-926c-8928-0000000006a2 28173 1726882771.06954: variable 'ansible_search_path' from source: unknown 28173 1726882771.06998: calling self._execute() 28173 1726882771.07094: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882771.07106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882771.07117: variable 'omit' from source: magic vars 28173 1726882771.07472: variable 'ansible_distribution_major_version' from source: facts 28173 1726882771.07489: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882771.07500: variable 'omit' from source: magic vars 28173 1726882771.07530: variable 'omit' from source: magic vars 28173 1726882771.07570: variable 'omit' from source: magic vars 28173 1726882771.07614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882771.07654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882771.07679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882771.07700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882771.07717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882771.07750: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882771.07759: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882771.07769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882771.07871: Set connection var ansible_pipelining to False 28173 1726882771.07880: Set connection var ansible_shell_type to sh 28173 1726882771.07893: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882771.07904: Set connection var ansible_timeout to 10 28173 1726882771.07913: Set connection var ansible_shell_executable to /bin/sh 28173 1726882771.07921: Set connection var ansible_connection to ssh 28173 1726882771.07945: variable 'ansible_shell_executable' from source: unknown 28173 1726882771.07952: variable 'ansible_connection' from source: unknown 28173 1726882771.07958: variable 'ansible_module_compression' from source: unknown 28173 1726882771.07966: variable 'ansible_shell_type' from source: unknown 28173 1726882771.07975: variable 'ansible_shell_executable' from source: unknown 28173 1726882771.07981: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882771.07988: variable 'ansible_pipelining' from source: unknown 28173 1726882771.07993: variable 'ansible_timeout' from source: unknown 28173 1726882771.08000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882771.08172: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882771.08193: variable 'omit' from source: magic vars 28173 1726882771.08203: starting attempt loop 28173 1726882771.08209: running the 
handler 28173 1726882771.08227: variable 'ansible_facts' from source: unknown 28173 1726882771.08249: _low_level_execute_command(): starting 28173 1726882771.08260: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882771.09011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882771.09025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.09039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.09058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.09101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.09115: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882771.09130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.09151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882771.09167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882771.09183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882771.09197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.09211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.09226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.09238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.09251: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882771.09267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.09345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882771.09362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.09379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.09532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.11199: stdout chunk (state=3): >>>/root <<< 28173 1726882771.11305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882771.11394: stderr chunk (state=3): >>><<< 28173 1726882771.11406: stdout chunk (state=3): >>><<< 28173 1726882771.11529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882771.11533: _low_level_execute_command(): starting 28173 1726882771.11536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630 `" && echo ansible-tmp-1726882771.114399-29226-63744026022630="` echo /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630 `" ) && sleep 0' 28173 1726882771.12225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882771.12246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.12272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.12296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.12378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.12392: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882771.12788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.12801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.12931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.15000: stdout chunk (state=3): >>>ansible-tmp-1726882771.114399-29226-63744026022630=/root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630 <<< 28173 1726882771.15027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882771.15030: stdout chunk (state=3): >>><<< 28173 1726882771.15033: stderr chunk (state=3): >>><<< 28173 1726882771.15272: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882771.114399-29226-63744026022630=/root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882771.15276: variable 'ansible_module_compression' from source: unknown 28173 1726882771.15279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882771.15281: variable 'ansible_facts' from source: unknown 28173 1726882771.15399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/AnsiballZ_setup.py 28173 1726882771.15978: Sending initial data 28173 1726882771.15981: Sent initial data (152 bytes) 28173 1726882771.17273: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882771.17288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.17309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.17327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.17373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.17386: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882771.17401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.17423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882771.17437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882771.17451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882771.17458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.17471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.17485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.17493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.17499: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882771.17509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.17586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882771.17604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.17616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.17743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.19573: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882771.19678: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882771.19791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp9i33_zf5 /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/AnsiballZ_setup.py <<< 28173 1726882771.19894: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882771.22882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882771.23027: stderr chunk (state=3): >>><<< 28173 1726882771.23030: stdout chunk (state=3): >>><<< 28173 1726882771.23057: done transferring module to remote 28173 1726882771.23071: _low_level_execute_command(): starting 28173 1726882771.23078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/ /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/AnsiballZ_setup.py && sleep 0' 28173 1726882771.24308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.24311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.24350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882771.24354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.24357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.24414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882771.24887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.24893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.24997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.26791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882771.26878: stderr chunk (state=3): >>><<< 28173 1726882771.26882: stdout chunk (state=3): >>><<< 28173 1726882771.26978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882771.26982: _low_level_execute_command(): starting 28173 1726882771.26985: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/AnsiballZ_setup.py && sleep 0' 28173 1726882771.28455: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882771.28473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.28486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.28504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.28547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.28629: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882771.28642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.28657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882771.28671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882771.28682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882771.28692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.28703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.28717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.28736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.28746: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882771.28757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.28843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882771.28969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.28984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.29190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.83702: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": 
"5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlz<<< 28173 1726882771.83741: stdout chunk (state=3): >>>dHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2801, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, 
"used": 731, "free": 2801}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 710, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238444544, "block_size": 4096, "block_total": 65519355, "block_available": 64511339, "block_used": 1008016, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"yea<<< 28173 1726882771.83768: stdout chunk (state=3): >>>r": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "31", "epoch": "1726882771", "epoch_int": "1726882771", "date": "2024-09-20", "time": "21:39:31", "iso8601_micro": "2024-09-21T01:39:31.760704Z", "iso8601": "2024-09-21T01:39:31Z", "iso8601_basic": "20240920T213931760704", "iso8601_basic_short": "20240920T213931", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "ethtest0", "peerethtest0", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "92:35:3e:53:1a:d5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::9035:3eff:fe53:1ad5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ce:7d:c7:1b:e6:34", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::efc8:b960:f816:d9e0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": 
"loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address"<<< 28173 1726882771.83778: stdout chunk (state=3): >>>: "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "<<< 28173 1726882771.83793: stdout chunk (state=3): >>>tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "198.51.100.3", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::9035:3eff:fe53:1ad5", "fe80::efc8:b960:f816:d9e0", "fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1", "fe80::9035:3eff:fe53:1ad5", "fe80::efc8:b960:f816:d9e0"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.42, "5m": 0.41, "15m": 0.25}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882771.85471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882771.85478: stdout chunk (state=3): >>><<< 28173 1726882771.85496: stderr chunk (state=3): >>><<< 28173 1726882771.85676: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2801, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 731, "free": 2801}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 710, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238444544, "block_size": 4096, "block_total": 65519355, "block_available": 64511339, "block_used": 1008016, "inode_total": 131071472, 
"inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "31", "epoch": "1726882771", "epoch_int": "1726882771", "date": "2024-09-20", "time": "21:39:31", "iso8601_micro": "2024-09-21T01:39:31.760704Z", "iso8601": "2024-09-21T01:39:31Z", "iso8601_basic": "20240920T213931760704", "iso8601_basic_short": "20240920T213931", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "ethtest0", "peerethtest0", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "92:35:3e:53:1a:d5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::9035:3eff:fe53:1ad5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ce:7d:c7:1b:e6:34", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::efc8:b960:f816:d9e0", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "198.51.100.3", "10.31.11.158"], 
"ansible_all_ipv6_addresses": ["fe80::9035:3eff:fe53:1ad5", "fe80::efc8:b960:f816:d9e0", "fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1", "fe80::9035:3eff:fe53:1ad5", "fe80::efc8:b960:f816:d9e0"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.42, "5m": 0.41, "15m": 0.25}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882771.86099: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882771.86126: _low_level_execute_command(): starting 28173 1726882771.86136: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882771.114399-29226-63744026022630/ > /dev/null 2>&1 && sleep 0' 28173 1726882771.86759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882771.86776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.86789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.86808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.86845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.86856: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882771.86878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.86896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882771.86907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882771.86918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882771.86928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882771.86940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882771.86952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882771.86962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882771.86977: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882771.86988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882771.87069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882771.87086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882771.87100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882771.87228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882771.89204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882771.89277: stderr 
chunk (state=3): >>><<< 28173 1726882771.89287: stdout chunk (state=3): >>><<< 28173 1726882771.89874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882771.89878: handler run complete 28173 1726882771.89881: variable 'ansible_facts' from source: unknown 28173 1726882771.89883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.90032: variable 'ansible_facts' from source: unknown 28173 1726882771.90142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.90323: attempt loop complete, returning result 28173 1726882771.90335: _execute() done 28173 1726882771.90344: dumping result to json 28173 1726882771.90399: done dumping result, returning 28173 1726882771.90413: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-0000000006a2] 28173 1726882771.90424: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006a2 ok: [managed_node2] 28173 1726882771.91376: no more pending results, returning what we have 28173 1726882771.91380: results queue empty 28173 1726882771.91381: checking for any_errors_fatal 28173 1726882771.91382: done checking for any_errors_fatal 28173 1726882771.91383: checking for max_fail_percentage 28173 1726882771.91385: done checking for max_fail_percentage 28173 1726882771.91386: checking to see if all hosts have failed and the running result is not ok 28173 1726882771.91386: done checking to see if all hosts have failed 28173 1726882771.91387: getting the remaining hosts for this loop 28173 1726882771.91389: done getting the remaining hosts for this loop 28173 1726882771.91393: getting the next task for host managed_node2 28173 1726882771.91399: done getting next task for host managed_node2 28173 1726882771.91401: ^ task is: TASK: meta (flush_handlers) 28173 1726882771.91403: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882771.91408: getting variables 28173 1726882771.91410: in VariableManager get_vars() 28173 1726882771.91444: Calling all_inventory to load vars for managed_node2 28173 1726882771.91447: Calling groups_inventory to load vars for managed_node2 28173 1726882771.91449: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882771.91462: Calling all_plugins_play to load vars for managed_node2 28173 1726882771.91470: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882771.91473: Calling groups_plugins_play to load vars for managed_node2 28173 1726882771.92285: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006a2 28173 1726882771.92288: WORKER PROCESS EXITING 28173 1726882771.93283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.95115: done with get_vars() 28173 1726882771.95145: done getting variables 28173 1726882771.95229: in VariableManager get_vars() 28173 1726882771.95244: Calling all_inventory to load vars for managed_node2 28173 1726882771.95246: Calling groups_inventory to load vars for managed_node2 28173 1726882771.95248: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882771.95254: Calling all_plugins_play to load vars for managed_node2 28173 1726882771.95256: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882771.95274: Calling groups_plugins_play to load vars for managed_node2 28173 1726882771.96556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882771.98436: done with get_vars() 28173 1726882771.98471: done queuing things up, now waiting for results queue to drain 28173 1726882771.98473: results queue empty 28173 1726882771.98474: checking for any_errors_fatal 28173 1726882771.98478: done checking for any_errors_fatal 28173 1726882771.98479: checking for max_fail_percentage 28173 1726882771.98480: done checking for max_fail_percentage 28173 1726882771.98481: checking to see if all hosts have failed and the running result is not ok 28173 1726882771.98482: done checking to see if all hosts have failed 28173 1726882771.98483: getting the remaining hosts for this loop 28173 1726882771.98484: done getting the remaining hosts for this loop 28173 1726882771.98486: getting the next task for host managed_node2 28173 1726882771.98490: done getting next task for host managed_node2 28173 1726882771.98494: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882771.98496: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882771.98507: getting variables 28173 1726882771.98508: in VariableManager get_vars() 28173 1726882771.98524: Calling all_inventory to load vars for managed_node2 28173 1726882771.98526: Calling groups_inventory to load vars for managed_node2 28173 1726882771.98528: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882771.98533: Calling all_plugins_play to load vars for managed_node2 28173 1726882771.98536: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882771.98539: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.00446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.03727: done with get_vars() 28173 1726882772.03750: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:32 -0400 (0:00:00.976) 0:00:25.202 ****** 28173 1726882772.03831: entering _queue_task() for managed_node2/include_tasks 28173 1726882772.04638: worker is 1 (out of 1 available) 28173 1726882772.04654: exiting _queue_task() for managed_node2/include_tasks 28173 1726882772.05160: done queuing things up, now waiting for results queue to drain 28173 1726882772.05162: waiting for pending results... 28173 1726882772.05490: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882772.05625: in run() - task 0e448fcc-3ce9-926c-8928-0000000000b7 28173 1726882772.05648: variable 'ansible_search_path' from source: unknown 28173 1726882772.05655: variable 'ansible_search_path' from source: unknown 28173 1726882772.05702: calling self._execute() 28173 1726882772.05919: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.05935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.05952: variable 'omit' from source: magic vars 28173 1726882772.06322: variable 'ansible_distribution_major_version' from source: facts 28173 1726882772.06337: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882772.06346: _execute() done 28173 1726882772.06352: dumping result to json 28173 1726882772.06357: done dumping result, returning 28173 1726882772.06375: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-926c-8928-0000000000b7] 28173 1726882772.06386: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b7 28173 1726882772.06532: no more pending results, returning what we have 28173 1726882772.06538: in VariableManager get_vars() 28173 1726882772.06589: Calling all_inventory to load vars for managed_node2 28173 1726882772.06592: Calling groups_inventory to load vars for managed_node2 28173 1726882772.06594: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882772.06607: Calling all_plugins_play to load vars for managed_node2 28173 1726882772.06611: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882772.06614: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.08285: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b7 28173 1726882772.08288: WORKER PROCESS EXITING 28173 1726882772.08652: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.10455: done with get_vars() 28173 1726882772.10480: variable 'ansible_search_path' from source: unknown 28173 1726882772.10482: variable 'ansible_search_path' from source: unknown 28173 1726882772.10510: we have included files to process 28173 1726882772.10511: generating all_blocks data 28173 1726882772.10512: done generating all_blocks data 28173 1726882772.10513: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882772.10514: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882772.10516: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882772.11087: done processing included file 28173 1726882772.11089: iterating over new_blocks loaded from include file 28173 1726882772.11091: in VariableManager get_vars() 28173 1726882772.11111: done with get_vars() 28173 1726882772.11112: filtering new block on tags 28173 1726882772.11128: done filtering new block on tags 28173 1726882772.11130: in VariableManager get_vars() 28173 1726882772.11147: done with get_vars() 28173 1726882772.11148: filtering new block on tags 28173 1726882772.11169: done filtering new block on tags 28173 1726882772.11172: in VariableManager get_vars() 28173 1726882772.11190: done with get_vars() 28173 1726882772.11192: filtering new block on tags 28173 1726882772.11208: done filtering new block on tags 28173 1726882772.11210: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28173 1726882772.11215: extending task lists for all hosts with included blocks 28173 1726882772.11597: done extending task lists 28173 1726882772.11599: done processing included files 28173 1726882772.11600: results queue empty 28173 1726882772.11600: checking for any_errors_fatal 28173 1726882772.11602: done checking for any_errors_fatal 28173 1726882772.11602: checking for max_fail_percentage 28173 1726882772.11603: done checking for max_fail_percentage 28173 1726882772.11604: checking to see if all hosts have failed and the running result is not ok 28173 1726882772.11605: done checking to see if all hosts have failed 28173 1726882772.11606: getting the remaining hosts for this loop 28173 1726882772.11607: done getting the remaining hosts for this loop 28173 1726882772.11610: getting the next task for host managed_node2 28173 1726882772.11614: done getting next task for host managed_node2 28173 1726882772.11616: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882772.11619: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882772.11628: getting variables 28173 1726882772.11629: in VariableManager get_vars() 28173 1726882772.11642: Calling all_inventory to load vars for managed_node2 28173 1726882772.11645: Calling groups_inventory to load vars for managed_node2 28173 1726882772.11647: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882772.11652: Calling all_plugins_play to load vars for managed_node2 28173 1726882772.11654: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882772.11657: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.12957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.15099: done with get_vars() 28173 1726882772.15124: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:32 -0400 (0:00:00.113) 0:00:25.316 ****** 28173 1726882772.15208: entering _queue_task() for managed_node2/setup 28173 1726882772.16615: worker is 1 (out of 1 available) 28173 1726882772.16628: exiting _queue_task() for managed_node2/setup 28173 1726882772.16641: done queuing things up, now waiting for results queue to drain 28173 1726882772.16642: waiting for pending results... 28173 1726882772.17615: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882772.17895: in run() - task 0e448fcc-3ce9-926c-8928-0000000006e3 28173 1726882772.17979: variable 'ansible_search_path' from source: unknown 28173 1726882772.17986: variable 'ansible_search_path' from source: unknown 28173 1726882772.18024: calling self._execute() 28173 1726882772.18155: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.18290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.18303: variable 'omit' from source: magic vars 28173 1726882772.18999: variable 'ansible_distribution_major_version' from source: facts 28173 1726882772.19057: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882772.19599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882772.24348: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882772.25202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882772.25244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882772.25287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882772.25435: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882772.25632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882772.25672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28173 1726882772.25702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882772.25861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882772.25887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882772.25938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882772.25974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882772.26087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882772.26130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882772.26184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882772.26452: variable '__network_required_facts' from source: role '' defaults 28173 1726882772.26607: variable 'ansible_facts' from source: unknown 28173 1726882772.28158: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28173 1726882772.28225: when evaluation is False, skipping this task 28173 1726882772.28233: _execute() done 28173 1726882772.28240: dumping result to json 28173 1726882772.28249: done dumping result, returning 28173 1726882772.28260: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-926c-8928-0000000006e3] 28173 1726882772.28278: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e3 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882772.28482: no more pending results, returning what we have 28173 1726882772.28488: results queue empty 28173 1726882772.28489: checking for any_errors_fatal 28173 1726882772.28491: done checking for any_errors_fatal 28173 1726882772.28491: checking for max_fail_percentage 28173 1726882772.28493: done checking for max_fail_percentage 28173 1726882772.28494: checking to see if all hosts have failed and the running result is not ok 28173 1726882772.28495: done checking to see if all hosts have failed 28173 1726882772.28496: getting the remaining hosts for this loop 28173 1726882772.28498: done getting the remaining hosts for this loop 28173 1726882772.28502: getting the next task for host managed_node2 28173 1726882772.28512: done getting next task for host 
managed_node2 28173 1726882772.28516: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882772.28519: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882772.28533: getting variables 28173 1726882772.28535: in VariableManager get_vars() 28173 1726882772.28584: Calling all_inventory to load vars for managed_node2 28173 1726882772.28587: Calling groups_inventory to load vars for managed_node2 28173 1726882772.28590: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882772.28601: Calling all_plugins_play to load vars for managed_node2 28173 1726882772.28605: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882772.28608: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.29945: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e3 28173 1726882772.29949: WORKER PROCESS EXITING 28173 1726882772.31739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.35010: done with get_vars() 28173 1726882772.35038: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:32 -0400 (0:00:00.199) 0:00:25.515 ****** 28173 1726882772.35142: entering _queue_task() for managed_node2/stat 28173 1726882772.35449: worker is 1 (out of 1 available) 28173 1726882772.35461: exiting _queue_task() for managed_node2/stat 28173 1726882772.35476: done queuing things up, now waiting for results queue to drain 28173 1726882772.35477: waiting for pending results... 
28173 1726882772.35844: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882772.36021: in run() - task 0e448fcc-3ce9-926c-8928-0000000006e5 28173 1726882772.36068: variable 'ansible_search_path' from source: unknown 28173 1726882772.36077: variable 'ansible_search_path' from source: unknown 28173 1726882772.36114: calling self._execute() 28173 1726882772.36213: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.36228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.36244: variable 'omit' from source: magic vars 28173 1726882772.36611: variable 'ansible_distribution_major_version' from source: facts 28173 1726882772.36627: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882772.36794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882772.37077: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882772.37124: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882772.37167: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882772.37205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882772.37295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882772.37322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882772.37357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882772.37390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882772.37485: variable '__network_is_ostree' from source: set_fact 28173 1726882772.37496: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882772.37503: when evaluation is False, skipping this task 28173 1726882772.37509: _execute() done 28173 1726882772.37515: dumping result to json 28173 1726882772.37521: done dumping result, returning 28173 1726882772.37530: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-926c-8928-0000000006e5] 28173 1726882772.37540: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e5 28173 1726882772.37645: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e5 28173 1726882772.37652: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882772.37735: no more pending results, returning what we have 28173 1726882772.37740: results queue empty 28173 1726882772.37741: checking for any_errors_fatal 28173 1726882772.37748: done checking for any_errors_fatal 28173 1726882772.37749: checking for 
max_fail_percentage 28173 1726882772.37750: done checking for max_fail_percentage 28173 1726882772.37751: checking to see if all hosts have failed and the running result is not ok 28173 1726882772.37753: done checking to see if all hosts have failed 28173 1726882772.37753: getting the remaining hosts for this loop 28173 1726882772.37755: done getting the remaining hosts for this loop 28173 1726882772.37758: getting the next task for host managed_node2 28173 1726882772.37765: done getting next task for host managed_node2 28173 1726882772.37769: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882772.37772: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882772.37789: getting variables 28173 1726882772.37794: in VariableManager get_vars() 28173 1726882772.37837: Calling all_inventory to load vars for managed_node2 28173 1726882772.37839: Calling groups_inventory to load vars for managed_node2 28173 1726882772.37842: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882772.37853: Calling all_plugins_play to load vars for managed_node2 28173 1726882772.37856: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882772.37858: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.40036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.42191: done with get_vars() 28173 1726882772.42221: done getting variables 28173 1726882772.42282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:32 -0400 (0:00:00.071) 0:00:25.587 ****** 28173 1726882772.42320: entering _queue_task() for managed_node2/set_fact 28173 1726882772.42844: worker is 1 (out of 1 available) 28173 1726882772.42971: exiting _queue_task() for managed_node2/set_fact 28173 1726882772.42984: done queuing things up, now waiting for results queue to drain 28173 1726882772.42986: waiting for pending results... 
28173 1726882772.43466: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882772.43604: in run() - task 0e448fcc-3ce9-926c-8928-0000000006e6 28173 1726882772.43631: variable 'ansible_search_path' from source: unknown 28173 1726882772.43638: variable 'ansible_search_path' from source: unknown 28173 1726882772.43681: calling self._execute() 28173 1726882772.43785: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.43795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.43807: variable 'omit' from source: magic vars 28173 1726882772.44186: variable 'ansible_distribution_major_version' from source: facts 28173 1726882772.44204: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882772.44382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882772.44673: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882772.44729: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882772.44768: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882772.44810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882772.44985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882772.45027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882772.45062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882772.45121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882772.45225: variable '__network_is_ostree' from source: set_fact 28173 1726882772.45246: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882772.45257: when evaluation is False, skipping this task 28173 1726882772.45263: _execute() done 28173 1726882772.45272: dumping result to json 28173 1726882772.45279: done dumping result, returning 28173 1726882772.45288: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-926c-8928-0000000006e6] 28173 1726882772.45297: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e6 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882772.45439: no more pending results, returning what we have 28173 1726882772.45442: results queue empty 28173 1726882772.45443: checking for any_errors_fatal 28173 1726882772.45450: done checking for any_errors_fatal 28173 1726882772.45451: checking for max_fail_percentage 28173 1726882772.45453: done checking for max_fail_percentage 28173 1726882772.45454: checking to see 
if all hosts have failed and the running result is not ok 28173 1726882772.45455: done checking to see if all hosts have failed 28173 1726882772.45455: getting the remaining hosts for this loop 28173 1726882772.45457: done getting the remaining hosts for this loop 28173 1726882772.45461: getting the next task for host managed_node2 28173 1726882772.45471: done getting next task for host managed_node2 28173 1726882772.45476: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882772.45479: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882772.45492: getting variables 28173 1726882772.45494: in VariableManager get_vars() 28173 1726882772.45532: Calling all_inventory to load vars for managed_node2 28173 1726882772.45535: Calling groups_inventory to load vars for managed_node2 28173 1726882772.45538: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882772.45548: Calling all_plugins_play to load vars for managed_node2 28173 1726882772.45551: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882772.45554: Calling groups_plugins_play to load vars for managed_node2 28173 1726882772.46655: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e6 28173 1726882772.46659: WORKER PROCESS EXITING 28173 1726882772.47487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882772.49667: done with get_vars() 28173 1726882772.49693: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:32 -0400 (0:00:00.075) 0:00:25.663 ****** 28173 1726882772.49903: entering _queue_task() for managed_node2/service_facts 28173 1726882772.50490: worker is 1 (out of 1 available) 28173 1726882772.50503: exiting _queue_task() for managed_node2/service_facts 28173 1726882772.50515: done queuing things up, now waiting for results queue to drain 28173 1726882772.50516: waiting for pending results... 
28173 1726882772.51294: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882772.51440: in run() - task 0e448fcc-3ce9-926c-8928-0000000006e8 28173 1726882772.51462: variable 'ansible_search_path' from source: unknown 28173 1726882772.51485: variable 'ansible_search_path' from source: unknown 28173 1726882772.51524: calling self._execute() 28173 1726882772.51639: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.51655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.51686: variable 'omit' from source: magic vars 28173 1726882772.52093: variable 'ansible_distribution_major_version' from source: facts 28173 1726882772.52111: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882772.52123: variable 'omit' from source: magic vars 28173 1726882772.52194: variable 'omit' from source: magic vars 28173 1726882772.52234: variable 'omit' from source: magic vars 28173 1726882772.52311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882772.52365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882772.52393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882772.52419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882772.52434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882772.52470: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882772.52479: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.52487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.52596: Set connection var ansible_pipelining to False 28173 1726882772.52604: Set connection var ansible_shell_type to sh 28173 1726882772.52623: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882772.52635: Set connection var ansible_timeout to 10 28173 1726882772.52644: Set connection var ansible_shell_executable to /bin/sh 28173 1726882772.52653: Set connection var ansible_connection to ssh 28173 1726882772.52684: variable 'ansible_shell_executable' from source: unknown 28173 1726882772.52692: variable 'ansible_connection' from source: unknown 28173 1726882772.52699: variable 'ansible_module_compression' from source: unknown 28173 1726882772.52705: variable 'ansible_shell_type' from source: unknown 28173 1726882772.52711: variable 'ansible_shell_executable' from source: unknown 28173 1726882772.52718: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882772.52729: variable 'ansible_pipelining' from source: unknown 28173 1726882772.52735: variable 'ansible_timeout' from source: unknown 28173 1726882772.52743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882772.52983: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882772.53000: variable 'omit' from source: magic vars 28173 
1726882772.53009: starting attempt loop 28173 1726882772.53017: running the handler 28173 1726882772.53034: _low_level_execute_command(): starting 28173 1726882772.53045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882772.54500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882772.54515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.54529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.54553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.54596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.54608: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882772.54622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.54639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882772.54650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882772.54669: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882772.54683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.54697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.54712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.54725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.54736: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882772.54749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.54833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882772.54855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882772.54876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882772.55021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882772.56680: stdout chunk (state=3): >>>/root <<< 28173 1726882772.56780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882772.56867: stderr chunk (state=3): >>><<< 28173 1726882772.56880: stdout chunk (state=3): >>><<< 28173 1726882772.56998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882772.57001: _low_level_execute_command(): starting 28173 1726882772.57004: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298 `" && echo ansible-tmp-1726882772.5690646-29281-19582914240298="` echo /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298 `" ) && sleep 0' 28173 1726882772.57578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882772.57591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.57605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.57622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.57676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.57689: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882772.57702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.57719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882772.57731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882772.57742: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882772.57762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.57779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.57795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.57808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.57819: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882772.57833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.57916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882772.57937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882772.57952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882772.58097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882772.59958: stdout chunk (state=3): >>>ansible-tmp-1726882772.5690646-29281-19582914240298=/root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298 <<< 28173 1726882772.60078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882772.60111: stderr chunk (state=3): >>><<< 28173 1726882772.60114: stdout chunk (state=3): >>><<< 28173 1726882772.60128: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882772.5690646-29281-19582914240298=/root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882772.60169: variable 'ansible_module_compression' from source: unknown 28173 1726882772.60198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28173 1726882772.60223: variable 'ansible_facts' from source: unknown 28173 1726882772.60282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/AnsiballZ_service_facts.py 28173 1726882772.60381: Sending initial data 28173 1726882772.60389: Sent initial data (161 bytes) 28173 1726882772.61204: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882772.61217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.61233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.61258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.61302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.61314: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882772.61326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.61342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882772.61361: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882772.61378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882772.61390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882772.61402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.61417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.61427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.61436: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882772.61449: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.61535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882772.61552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882772.61573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882772.61714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882772.63461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882772.63556: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882772.63654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6pawzbx5 /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/AnsiballZ_service_facts.py <<< 28173 1726882772.63749: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882772.65018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882772.65168: stderr chunk (state=3): >>><<< 28173 1726882772.65172: stdout chunk (state=3): >>><<< 28173 1726882772.65174: done transferring module to remote 28173 1726882772.65179: _low_level_execute_command(): starting 28173 1726882772.65251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/ /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/AnsiballZ_service_facts.py && sleep 0' 28173 1726882772.65793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.65798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.65818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882772.65844: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.65846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882772.65849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.65851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.65910: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882772.65913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882772.65921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882772.66021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882772.67803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882772.68071: stderr chunk (state=3): >>><<< 28173 1726882772.68076: stdout chunk (state=3): >>><<< 28173 1726882772.69211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882772.69215: _low_level_execute_command(): starting 28173 1726882772.69217: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/AnsiballZ_service_facts.py && sleep 0' 28173 1726882772.69513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.69516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882772.69551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882772.69554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882772.69569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882772.69613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882772.69630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882772.69749: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28173 1726882774.03157: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28173 1726882774.04383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882774.04445: stderr chunk (state=3): >>><<< 28173 1726882774.04450: stdout chunk (state=3): >>><<< 28173 1726882774.04578: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882774.14333: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882774.14350: _low_level_execute_command(): starting 28173 1726882774.14369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882772.5690646-29281-19582914240298/ > /dev/null 2>&1 && sleep 0' 28173 1726882774.15124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882774.15142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.15158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.15179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.15221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.15236: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.15253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.15275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.15288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.15299: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.15311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.15324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.15341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.15359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.15374: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.15390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.15465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882774.15486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.15501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.15649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.17503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882774.17558: stderr chunk (state=3): >>><<< 28173 1726882774.17561: stdout chunk (state=3): >>><<< 28173 1726882774.17583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882774.17586: handler run complete 28173 1726882774.17755: variable 'ansible_facts' from source: unknown 28173 1726882774.17899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882774.18337: variable 'ansible_facts' from source: unknown 28173 1726882774.18461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882774.18656: attempt loop complete, returning result 28173 1726882774.18659: _execute() done 28173 1726882774.18662: dumping result to json 28173 1726882774.18728: done dumping result, returning 28173 1726882774.18736: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-926c-8928-0000000006e8] 28173 1726882774.18740: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e8 28173 1726882774.23804: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e8 28173 1726882774.23807: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882774.23900: no more pending results, returning what we have 28173 1726882774.23902: results queue empty 28173 1726882774.23903: checking for any_errors_fatal 28173 1726882774.23905: done checking for any_errors_fatal 28173 1726882774.23906: checking for max_fail_percentage 28173 1726882774.23907: done checking for max_fail_percentage 28173 1726882774.23908: checking to see if all hosts have failed and the running result is not ok 28173 1726882774.23908: done checking to see if all hosts have failed 28173 1726882774.23909: getting the remaining hosts for this loop 28173 1726882774.23910: done getting the remaining hosts for this loop 28173 1726882774.23913: getting the next task for host managed_node2 28173 1726882774.23916: done getting next task for host managed_node2 28173 1726882774.23920: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882774.23923: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882774.23930: getting variables 28173 1726882774.23931: in VariableManager get_vars() 28173 1726882774.23951: Calling all_inventory to load vars for managed_node2 28173 1726882774.23953: Calling groups_inventory to load vars for managed_node2 28173 1726882774.23955: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882774.23960: Calling all_plugins_play to load vars for managed_node2 28173 1726882774.23965: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882774.23968: Calling groups_plugins_play to load vars for managed_node2 28173 1726882774.25234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882774.27038: done with get_vars() 28173 1726882774.27061: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:34 -0400 (0:00:01.772) 0:00:27.435 ****** 28173 1726882774.27142: entering _queue_task() for managed_node2/package_facts 28173 1726882774.27457: worker is 1 (out of 1 available) 28173 1726882774.27499: exiting _queue_task() for managed_node2/package_facts 28173 1726882774.27511: done queuing things up, now waiting for results queue to drain 28173 1726882774.27512: waiting for pending results... 
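The task queued here, fedora.linux_system_roles.network : Check which packages are installed (set_facts.yml:26), resolves to the package_facts action, as the TaskExecutor run that follows shows. The actual contents of set_facts.yml are not reproduced in this log; a minimal sketch of a task of this shape, with the manager parameter left at its auto default as an assumption, would be:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto

package_facts populates ansible_facts.packages, keyed by package name, in the same way that service_facts populated ansible_facts.services above.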
28173 1726882774.27807: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882774.27930: in run() - task 0e448fcc-3ce9-926c-8928-0000000006e9 28173 1726882774.27950: variable 'ansible_search_path' from source: unknown 28173 1726882774.27960: variable 'ansible_search_path' from source: unknown 28173 1726882774.28003: calling self._execute() 28173 1726882774.28113: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882774.28123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882774.28138: variable 'omit' from source: magic vars 28173 1726882774.28550: variable 'ansible_distribution_major_version' from source: facts 28173 1726882774.28569: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882774.28581: variable 'omit' from source: magic vars 28173 1726882774.28642: variable 'omit' from source: magic vars 28173 1726882774.28683: variable 'omit' from source: magic vars 28173 1726882774.28730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882774.28770: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882774.28793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882774.28813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882774.28836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882774.28870: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882774.28879: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882774.28887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882774.28998: Set connection var ansible_pipelining to False 28173 1726882774.29005: Set connection var ansible_shell_type to sh 28173 1726882774.29020: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882774.29036: Set connection var ansible_timeout to 10 28173 1726882774.29051: Set connection var ansible_shell_executable to /bin/sh 28173 1726882774.29060: Set connection var ansible_connection to ssh 28173 1726882774.29088: variable 'ansible_shell_executable' from source: unknown 28173 1726882774.29095: variable 'ansible_connection' from source: unknown 28173 1726882774.29101: variable 'ansible_module_compression' from source: unknown 28173 1726882774.29107: variable 'ansible_shell_type' from source: unknown 28173 1726882774.29112: variable 'ansible_shell_executable' from source: unknown 28173 1726882774.29117: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882774.29123: variable 'ansible_pipelining' from source: unknown 28173 1726882774.29129: variable 'ansible_timeout' from source: unknown 28173 1726882774.29134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882774.29332: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882774.29346: variable 'omit' from source: magic vars 28173 
1726882774.29355: starting attempt loop 28173 1726882774.29363: running the handler 28173 1726882774.29389: _low_level_execute_command(): starting 28173 1726882774.29401: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882774.30149: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882774.30172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.30188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.30205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.30254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.30280: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.30295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.30312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.30322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.30332: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.30343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.30355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.30371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.30390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.30402: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.30415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.30496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882774.30520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.30535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.30667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.32327: stdout chunk (state=3): >>>/root <<< 28173 1726882774.32474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882774.32502: stderr chunk (state=3): >>><<< 28173 1726882774.32512: stdout chunk (state=3): >>><<< 28173 1726882774.32611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882774.32614: _low_level_execute_command(): starting 28173 1726882774.32617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784 `" && echo ansible-tmp-1726882774.325339-29354-223555651591784="` echo /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784 `" ) && sleep 0' 28173 1726882774.34608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.34612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.34636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.34651: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.34674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.34696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.34711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.34725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.34740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.34756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.34779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.34793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.34805: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.34817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.34889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882774.35583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.35598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.35731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.37656: stdout chunk (state=3): >>>ansible-tmp-1726882774.325339-29354-223555651591784=/root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784 <<< 28173 1726882774.37854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882774.37858: stdout chunk (state=3): >>><<< 28173 1726882774.37860: stderr chunk (state=3): >>><<< 28173 1726882774.37969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882774.325339-29354-223555651591784=/root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882774.37973: variable 'ansible_module_compression' from source: unknown 28173 1726882774.38175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28173 1726882774.38178: variable 'ansible_facts' from source: unknown 28173 1726882774.38231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/AnsiballZ_package_facts.py 28173 1726882774.38856: Sending initial data 28173 1726882774.38860: Sent initial data (161 bytes) 28173 1726882774.41818: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882774.41834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.41849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.41876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.41919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.41932: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.41947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.41970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.41984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.41997: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.42009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.42026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.42044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.42057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.42073: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.42088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.42163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 
1726882774.42297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.42313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.42493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.44323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882774.44424: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882774.44518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpx3ert_ys /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/AnsiballZ_package_facts.py <<< 28173 1726882774.44620: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882774.47883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882774.48072: stderr chunk (state=3): >>><<< 28173 1726882774.48076: stdout chunk (state=3): >>><<< 28173 1726882774.48079: done transferring module to remote 28173 1726882774.48081: _low_level_execute_command(): starting 28173 1726882774.48083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/ /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/AnsiballZ_package_facts.py && sleep 0' 28173 1726882774.48693: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882774.48708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.48723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.48745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.48793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.48805: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.48820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.48837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.48853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.48870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.48903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.48919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.48935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.48947: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.48961: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.48981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.49089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882774.49111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.49127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.49284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.51071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882774.51125: stderr chunk (state=3): >>><<< 28173 1726882774.51128: stdout chunk (state=3): >>><<< 28173 1726882774.51217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882774.51220: _low_level_execute_command(): starting 28173 1726882774.51223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/AnsiballZ_package_facts.py && sleep 0' 28173 1726882774.52102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882774.52116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882774.52128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.52145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.52189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.52200: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882774.52216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.52232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882774.52242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882774.52252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882774.52263: stderr chunk (state=3): >>>debug1: Reading 
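
[Annotation] The stdout chunks further below carry the module result as a single JSON document of the form {"ansible_facts": {"packages": {...}}}, where each package name maps to a list of installed instances with name, version, release, epoch, arch, and source fields. A minimal Python sketch of reading such a payload follows; the capture file name is an assumption for the example, and the package names queried are ones visible in the output below.

    # Illustrative sketch only: consuming a package-facts payload of the shape shown below.
    import json

    # Assumed capture of the module's stdout from the log below.
    with open("package_facts_output.json") as fh:
        packages = json.load(fh)["ansible_facts"]["packages"]

    # Each key maps to a list of installed instances of that package.
    for instance in packages.get("NetworkManager", []):
        print(instance["name"], instance["version"], instance["release"], instance["arch"])

    # A simple presence check over the gathered facts.
    print("openssh-server" in packages)
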
configuration data /root/.ssh/config <<< 28173 1726882774.52282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882774.52297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882774.52307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882774.52317: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882774.52332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882774.52410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882774.52433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882774.52447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882774.52581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882774.98677: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 28173 1726882774.98761: stdout chunk (state=3): >>>}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": 
"shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x8<<< 28173 1726882774.98777: stdout chunk (state=3): >>>6_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", 
"version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 28173 1726882774.98786: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", 
"version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epo<<< 28173 1726882774.98791: stdout chunk (state=3): >>>ch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", 
"version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", 
"version": "1.02.198", "release": "2.el9", "epoch"<<< 28173 1726882774.98797: stdout chunk (state=3): >>>: 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source":<<< 28173 1726882774.98805: stdout chunk (state=3): >>> "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rp<<< 28173 1726882774.98809: stdout chunk (state=3): >>>m"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", 
"release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1"<<< 28173 1726882774.98837: stdout chunk (state=3): >>>, "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": 
"2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": 
"2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": 
"461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "<<< 28173 1726882774.98877: stdout chunk (state=3): >>>8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28173 1726882775.00443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882775.00446: stdout chunk (state=3): >>><<< 28173 1726882775.00449: stderr chunk (state=3): >>><<< 28173 1726882775.00680: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": 
"centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": 
"gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": 
[{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": 
"0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", 
"version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882775.05245: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882775.05389: _low_level_execute_command(): starting 28173 1726882775.05400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882774.325339-29354-223555651591784/ > /dev/null 2>&1 && sleep 0' 28173 1726882775.07354: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882775.07363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882775.07378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882775.07391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882775.07432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882775.07435: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882775.07446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882775.07459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882775.07470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882775.07476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882775.07485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882775.07493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882775.07504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882775.07511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882775.07517: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882775.07526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882775.07599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882775.07617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882775.07629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882775.07761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882775.09704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882775.09708: stdout chunk (state=3): >>><<< 28173 1726882775.09710: stderr chunk (state=3): >>><<< 28173 1726882775.09721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882775.09728: handler run complete 28173 1726882775.11302: variable 'ansible_facts' from source: unknown 28173 1726882775.12074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.16625: variable 'ansible_facts' from source: unknown 28173 1726882775.17646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.19349: attempt loop complete, returning result 28173 1726882775.19361: _execute() done 28173 1726882775.19369: dumping result to json 28173 1726882775.19754: done dumping result, returning 28173 1726882775.19781: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-926c-8928-0000000006e9] 28173 1726882775.19791: sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e9 28173 1726882775.23055: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000006e9 28173 1726882775.23058: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882775.23210: no more pending results, returning what we have 28173 1726882775.23213: results queue empty 28173 1726882775.23214: checking for any_errors_fatal 28173 1726882775.23218: done checking for any_errors_fatal 28173 1726882775.23219: checking for max_fail_percentage 28173 1726882775.23220: done checking for max_fail_percentage 28173 1726882775.23221: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.23222: done checking to see if all hosts have failed 28173 1726882775.23223: getting the remaining hosts for this loop 28173 1726882775.23224: done getting the remaining hosts for this loop 28173 1726882775.23227: getting the next task for host managed_node2 28173 1726882775.23234: done getting next task for host managed_node2 28173 1726882775.23238: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882775.23240: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882775.23250: getting variables 28173 1726882775.23252: in VariableManager get_vars() 28173 1726882775.23290: Calling all_inventory to load vars for managed_node2 28173 1726882775.23293: Calling groups_inventory to load vars for managed_node2 28173 1726882775.23295: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.23305: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.23308: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.23311: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.25149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.26654: done with get_vars() 28173 1726882775.26676: done getting variables 28173 1726882775.26717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:35 -0400 (0:00:00.995) 0:00:28.431 ****** 28173 1726882775.26743: entering _queue_task() for managed_node2/debug 28173 1726882775.27022: worker is 1 (out of 1 available) 28173 1726882775.27034: exiting _queue_task() for managed_node2/debug 28173 1726882775.27045: done queuing things up, now waiting for results queue to drain 28173 1726882775.27047: waiting for pending results... 28173 1726882775.27353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882775.27475: in run() - task 0e448fcc-3ce9-926c-8928-0000000000b8 28173 1726882775.27500: variable 'ansible_search_path' from source: unknown 28173 1726882775.27506: variable 'ansible_search_path' from source: unknown 28173 1726882775.27549: calling self._execute() 28173 1726882775.27658: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.27674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.27690: variable 'omit' from source: magic vars 28173 1726882775.28124: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.28156: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.28170: variable 'omit' from source: magic vars 28173 1726882775.28197: variable 'omit' from source: magic vars 28173 1726882775.28292: variable 'network_provider' from source: set_fact 28173 1726882775.28308: variable 'omit' from source: magic vars 28173 1726882775.28348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882775.28386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882775.28400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882775.28414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882775.28427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 
1726882775.28462: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882775.28473: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.28480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.28574: Set connection var ansible_pipelining to False 28173 1726882775.28582: Set connection var ansible_shell_type to sh 28173 1726882775.28597: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882775.28607: Set connection var ansible_timeout to 10 28173 1726882775.28615: Set connection var ansible_shell_executable to /bin/sh 28173 1726882775.28623: Set connection var ansible_connection to ssh 28173 1726882775.28647: variable 'ansible_shell_executable' from source: unknown 28173 1726882775.28654: variable 'ansible_connection' from source: unknown 28173 1726882775.28659: variable 'ansible_module_compression' from source: unknown 28173 1726882775.28671: variable 'ansible_shell_type' from source: unknown 28173 1726882775.28678: variable 'ansible_shell_executable' from source: unknown 28173 1726882775.28683: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.28690: variable 'ansible_pipelining' from source: unknown 28173 1726882775.28698: variable 'ansible_timeout' from source: unknown 28173 1726882775.28705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.28839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882775.28854: variable 'omit' from source: magic vars 28173 1726882775.28867: starting attempt loop 28173 1726882775.28875: running the handler 28173 1726882775.28922: handler run complete 28173 1726882775.28944: attempt loop complete, returning result 28173 1726882775.28951: _execute() done 28173 1726882775.28957: dumping result to json 28173 1726882775.28962: done dumping result, returning 28173 1726882775.28978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-926c-8928-0000000000b8] 28173 1726882775.28989: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b8 28173 1726882775.29093: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b8 28173 1726882775.29100: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28173 1726882775.29357: no more pending results, returning what we have 28173 1726882775.29359: results queue empty 28173 1726882775.29360: checking for any_errors_fatal 28173 1726882775.29369: done checking for any_errors_fatal 28173 1726882775.29369: checking for max_fail_percentage 28173 1726882775.29371: done checking for max_fail_percentage 28173 1726882775.29371: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.29372: done checking to see if all hosts have failed 28173 1726882775.29372: getting the remaining hosts for this loop 28173 1726882775.29373: done getting the remaining hosts for this loop 28173 1726882775.29375: getting the next task for host managed_node2 28173 1726882775.29379: done getting next task for host managed_node2 28173 1726882775.29381: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 28173 1726882775.29383: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882775.29389: getting variables 28173 1726882775.29390: in VariableManager get_vars() 28173 1726882775.29422: Calling all_inventory to load vars for managed_node2 28173 1726882775.29424: Calling groups_inventory to load vars for managed_node2 28173 1726882775.29427: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.29438: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.29440: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.29441: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.31422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.32754: done with get_vars() 28173 1726882775.32775: done getting variables 28173 1726882775.32856: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:35 -0400 (0:00:00.061) 0:00:28.493 ****** 28173 1726882775.32890: entering _queue_task() for managed_node2/fail 28173 1726882775.33202: worker is 1 (out of 1 available) 28173 1726882775.33214: exiting _queue_task() for managed_node2/fail 28173 1726882775.33226: done queuing things up, now waiting for results queue to drain 28173 1726882775.33227: waiting for pending results... 
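Editor's note: the run above shows the role's "Print network provider" task (tasks/main.yml:7) resolving network_provider from an earlier set_fact and reporting "Using network provider: nm". A minimal sketch of what a debug task producing that output could look like is below; the exact wording in the shipped role may differ, and the msg text is inferred from the log output.

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
      # The log also shows "ansible_distribution_major_version != '6'" being
      # evaluated for this task; in the real role that guard may be attached
      # at an enclosing block rather than on the task itself.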
28173 1726882775.33535: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882775.33662: in run() - task 0e448fcc-3ce9-926c-8928-0000000000b9 28173 1726882775.33688: variable 'ansible_search_path' from source: unknown 28173 1726882775.33696: variable 'ansible_search_path' from source: unknown 28173 1726882775.33745: calling self._execute() 28173 1726882775.33909: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.33931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.33938: variable 'omit' from source: magic vars 28173 1726882775.34802: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.34805: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.34808: variable 'network_state' from source: role '' defaults 28173 1726882775.34810: Evaluated conditional (network_state != {}): False 28173 1726882775.34812: when evaluation is False, skipping this task 28173 1726882775.34814: _execute() done 28173 1726882775.34816: dumping result to json 28173 1726882775.34818: done dumping result, returning 28173 1726882775.34824: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-926c-8928-0000000000b9] 28173 1726882775.34827: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b9 28173 1726882775.34901: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000b9 28173 1726882775.34904: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882775.34949: no more pending results, returning what we have 28173 1726882775.34952: results queue empty 28173 1726882775.34953: checking for any_errors_fatal 28173 1726882775.34960: done checking for any_errors_fatal 28173 1726882775.34961: checking for max_fail_percentage 28173 1726882775.34967: done checking for max_fail_percentage 28173 1726882775.34968: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.34969: done checking to see if all hosts have failed 28173 1726882775.34970: getting the remaining hosts for this loop 28173 1726882775.34971: done getting the remaining hosts for this loop 28173 1726882775.34974: getting the next task for host managed_node2 28173 1726882775.34978: done getting next task for host managed_node2 28173 1726882775.34982: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882775.34987: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882775.34996: getting variables 28173 1726882775.34998: in VariableManager get_vars() 28173 1726882775.35024: Calling all_inventory to load vars for managed_node2 28173 1726882775.35026: Calling groups_inventory to load vars for managed_node2 28173 1726882775.35028: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.35037: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.35039: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.35040: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.36578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.37556: done with get_vars() 28173 1726882775.37576: done getting variables 28173 1726882775.37620: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:35 -0400 (0:00:00.047) 0:00:28.540 ****** 28173 1726882775.37641: entering _queue_task() for managed_node2/fail 28173 1726882775.37839: worker is 1 (out of 1 available) 28173 1726882775.37852: exiting _queue_task() for managed_node2/fail 28173 1726882775.37865: done queuing things up, now waiting for results queue to drain 28173 1726882775.37869: waiting for pending results... 
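Editor's note: the skip result above reports false_condition "network_state != {}" for the "Abort applying the network state configuration if using the network_state variable with the initscripts provider" task (tasks/main.yml:11). A hedged sketch of such a guard task follows; only the network_state check is visible in this excerpt, so the provider check implied by the task name is an assumption here.

    - name: >-
        Abort applying the network state configuration if using the
        network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed; only the first condition appears in this excerpt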
28173 1726882775.38049: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882775.38122: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ba 28173 1726882775.38184: variable 'ansible_search_path' from source: unknown 28173 1726882775.38187: variable 'ansible_search_path' from source: unknown 28173 1726882775.38211: calling self._execute() 28173 1726882775.38317: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.38330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.38346: variable 'omit' from source: magic vars 28173 1726882775.38775: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.38792: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.38929: variable 'network_state' from source: role '' defaults 28173 1726882775.38949: Evaluated conditional (network_state != {}): False 28173 1726882775.38956: when evaluation is False, skipping this task 28173 1726882775.38968: _execute() done 28173 1726882775.38977: dumping result to json 28173 1726882775.38987: done dumping result, returning 28173 1726882775.39000: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-926c-8928-0000000000ba] 28173 1726882775.39013: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ba skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882775.39174: no more pending results, returning what we have 28173 1726882775.39178: results queue empty 28173 1726882775.39179: checking for any_errors_fatal 28173 1726882775.39187: done checking for any_errors_fatal 28173 1726882775.39188: checking for max_fail_percentage 28173 1726882775.39190: done checking for max_fail_percentage 28173 1726882775.39191: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.39191: done checking to see if all hosts have failed 28173 1726882775.39192: getting the remaining hosts for this loop 28173 1726882775.39194: done getting the remaining hosts for this loop 28173 1726882775.39197: getting the next task for host managed_node2 28173 1726882775.39203: done getting next task for host managed_node2 28173 1726882775.39207: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882775.39209: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882775.39236: getting variables 28173 1726882775.39242: in VariableManager get_vars() 28173 1726882775.39303: Calling all_inventory to load vars for managed_node2 28173 1726882775.39306: Calling groups_inventory to load vars for managed_node2 28173 1726882775.39308: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.39319: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.39322: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.39324: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.39895: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ba 28173 1726882775.39898: WORKER PROCESS EXITING 28173 1726882775.40230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.41216: done with get_vars() 28173 1726882775.41234: done getting variables 28173 1726882775.41278: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:35 -0400 (0:00:00.036) 0:00:28.577 ****** 28173 1726882775.41298: entering _queue_task() for managed_node2/fail 28173 1726882775.41489: worker is 1 (out of 1 available) 28173 1726882775.41504: exiting _queue_task() for managed_node2/fail 28173 1726882775.41515: done queuing things up, now waiting for results queue to drain 28173 1726882775.41517: waiting for pending results... 
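Editor's note: both abort tasks above are skipped because network_state != {} evaluates to False, i.e. network_state is still the role default of an empty dict. To confirm which facts and variables these guards are reading on a managed host, a quick debug task such as the illustrative sketch below prints them (variable names are taken directly from the log):

    - name: Show the facts and variables the guards evaluate
      ansible.builtin.debug:
        msg:
          - "distribution: {{ ansible_distribution }}"
          - "major version: {{ ansible_distribution_major_version }}"
          - "network_state: {{ network_state | default({}) }}"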
28173 1726882775.41693: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882775.41758: in run() - task 0e448fcc-3ce9-926c-8928-0000000000bb 28173 1726882775.41773: variable 'ansible_search_path' from source: unknown 28173 1726882775.41778: variable 'ansible_search_path' from source: unknown 28173 1726882775.41804: calling self._execute() 28173 1726882775.41877: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.41881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.41892: variable 'omit' from source: magic vars 28173 1726882775.42155: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.42168: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.42285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882775.44113: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882775.44156: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882775.44187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882775.44211: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882775.44230: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882775.44293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.44313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.44330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.44357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.44372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.44437: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.44448: Evaluated conditional (ansible_distribution_major_version | int > 9): False 28173 1726882775.44451: when evaluation is False, skipping this task 28173 1726882775.44456: _execute() done 28173 1726882775.44458: dumping result to json 28173 1726882775.44461: done dumping result, returning 28173 1726882775.44468: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-926c-8928-0000000000bb] 28173 1726882775.44476: sending task result for task 
0e448fcc-3ce9-926c-8928-0000000000bb 28173 1726882775.44559: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bb 28173 1726882775.44562: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 28173 1726882775.44624: no more pending results, returning what we have 28173 1726882775.44628: results queue empty 28173 1726882775.44629: checking for any_errors_fatal 28173 1726882775.44634: done checking for any_errors_fatal 28173 1726882775.44634: checking for max_fail_percentage 28173 1726882775.44636: done checking for max_fail_percentage 28173 1726882775.44637: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.44638: done checking to see if all hosts have failed 28173 1726882775.44639: getting the remaining hosts for this loop 28173 1726882775.44640: done getting the remaining hosts for this loop 28173 1726882775.44643: getting the next task for host managed_node2 28173 1726882775.44648: done getting next task for host managed_node2 28173 1726882775.44652: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882775.44654: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882775.44669: getting variables 28173 1726882775.44670: in VariableManager get_vars() 28173 1726882775.44713: Calling all_inventory to load vars for managed_node2 28173 1726882775.44716: Calling groups_inventory to load vars for managed_node2 28173 1726882775.44718: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.44727: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.44730: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.44732: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.45647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.46609: done with get_vars() 28173 1726882775.46624: done getting variables 28173 1726882775.46669: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:35 -0400 (0:00:00.053) 0:00:28.631 ****** 28173 1726882775.46690: entering _queue_task() for managed_node2/dnf 28173 1726882775.46884: worker is 1 (out of 1 available) 28173 1726882775.46898: exiting _queue_task() for managed_node2/dnf 28173 1726882775.46909: done queuing things up, now waiting for results queue to drain 28173 1726882775.46910: waiting for pending results... 
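Editor's note: the teaming guard above is skipped because ansible_distribution_major_version | int > 9 is False on this host. A minimal sketch of a fail task using that exact condition (the message text is an assumption) would be:

    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on this distribution version  # assumed wording
      when: ansible_distribution_major_version | int > 9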
28173 1726882775.47111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882775.47192: in run() - task 0e448fcc-3ce9-926c-8928-0000000000bc 28173 1726882775.47205: variable 'ansible_search_path' from source: unknown 28173 1726882775.47208: variable 'ansible_search_path' from source: unknown 28173 1726882775.47236: calling self._execute() 28173 1726882775.47314: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.47320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.47328: variable 'omit' from source: magic vars 28173 1726882775.47598: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.47608: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.47746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882775.49878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882775.49924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882775.49950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882775.49976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882775.49996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882775.50053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.50087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.50103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.50135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.50145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.50225: variable 'ansible_distribution' from source: facts 28173 1726882775.50230: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.50243: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28173 1726882775.50316: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882775.50400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.50416: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.50434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.50462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.50475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.50502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.50518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.50534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.50561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.50576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.50602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.50617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.50633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.50658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.50674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.50769: variable 'network_connections' from source: play vars 28173 1726882775.50776: variable 'profile' from source: play vars 28173 1726882775.50824: variable 'profile' from source: play vars 28173 1726882775.50827: variable 'interface' from source: set_fact 28173 1726882775.50873: variable 'interface' from source: set_fact 28173 1726882775.50918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 28173 1726882775.51053: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882775.51089: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882775.51129: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882775.51147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882775.51480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882775.51483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882775.51493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.51495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882775.51497: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882775.51774: variable 'network_connections' from source: play vars 28173 1726882775.51778: variable 'profile' from source: play vars 28173 1726882775.51779: variable 'profile' from source: play vars 28173 1726882775.51781: variable 'interface' from source: set_fact 28173 1726882775.51783: variable 'interface' from source: set_fact 28173 1726882775.51784: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882775.51786: when evaluation is False, skipping this task 28173 1726882775.51788: _execute() done 28173 1726882775.51789: dumping result to json 28173 1726882775.51791: done dumping result, returning 28173 1726882775.51793: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000bc] 28173 1726882775.51795: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bc 28173 1726882775.51856: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bc 28173 1726882775.51859: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882775.51914: no more pending results, returning what we have 28173 1726882775.51917: results queue empty 28173 1726882775.51918: checking for any_errors_fatal 28173 1726882775.51923: done checking for any_errors_fatal 28173 1726882775.51923: checking for max_fail_percentage 28173 1726882775.51925: done checking for max_fail_percentage 28173 1726882775.51926: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.51927: done checking to see if all hosts have failed 28173 1726882775.51927: getting the remaining hosts for this loop 28173 1726882775.51929: done getting the remaining hosts for this loop 28173 
1726882775.51931: getting the next task for host managed_node2 28173 1726882775.51936: done getting next task for host managed_node2 28173 1726882775.51940: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882775.51942: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882775.51952: getting variables 28173 1726882775.51954: in VariableManager get_vars() 28173 1726882775.51990: Calling all_inventory to load vars for managed_node2 28173 1726882775.51993: Calling groups_inventory to load vars for managed_node2 28173 1726882775.51995: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.52004: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.52007: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.52009: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.53849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.55718: done with get_vars() 28173 1726882775.55747: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882775.55822: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:35 -0400 (0:00:00.091) 0:00:28.722 ****** 28173 1726882775.55855: entering _queue_task() for managed_node2/yum 28173 1726882775.56153: worker is 1 (out of 1 available) 28173 1726882775.56175: exiting _queue_task() for managed_node2/yum 28173 1726882775.56187: done queuing things up, now waiting for results queue to drain 28173 1726882775.56188: waiting for pending results... 
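Editor's note: the DNF check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the profile defined in network_connections. A sketch of a check-style dnf task gated the same way is shown below; only the condition and the dnf action plugin are confirmed by the log, the package list and the check_mode usage are assumptions.

    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed; this excerpt does not show the package list
        state: latest
      check_mode: true  # assumed: the "Check if updates" wording suggests a dry run
      when: __network_wireless_connections_defined or __network_team_connections_defined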
28173 1726882775.56487: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882775.56595: in run() - task 0e448fcc-3ce9-926c-8928-0000000000bd 28173 1726882775.56620: variable 'ansible_search_path' from source: unknown 28173 1726882775.56633: variable 'ansible_search_path' from source: unknown 28173 1726882775.56678: calling self._execute() 28173 1726882775.56790: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.56802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.56818: variable 'omit' from source: magic vars 28173 1726882775.57249: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.57278: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.57471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882775.61136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882775.61370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882775.61419: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882775.61549: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882775.61583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882775.61779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.61824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.61861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.61995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.62015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.62232: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.62254: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28173 1726882775.62286: when evaluation is False, skipping this task 28173 1726882775.62293: _execute() done 28173 1726882775.62301: dumping result to json 28173 1726882775.62395: done dumping result, returning 28173 1726882775.62407: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000bd] 28173 
1726882775.62419: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bd skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28173 1726882775.62587: no more pending results, returning what we have 28173 1726882775.62590: results queue empty 28173 1726882775.62591: checking for any_errors_fatal 28173 1726882775.62599: done checking for any_errors_fatal 28173 1726882775.62600: checking for max_fail_percentage 28173 1726882775.62602: done checking for max_fail_percentage 28173 1726882775.62603: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.62604: done checking to see if all hosts have failed 28173 1726882775.62605: getting the remaining hosts for this loop 28173 1726882775.62607: done getting the remaining hosts for this loop 28173 1726882775.62611: getting the next task for host managed_node2 28173 1726882775.62617: done getting next task for host managed_node2 28173 1726882775.62622: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882775.62624: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882775.62638: getting variables 28173 1726882775.62640: in VariableManager get_vars() 28173 1726882775.62688: Calling all_inventory to load vars for managed_node2 28173 1726882775.62691: Calling groups_inventory to load vars for managed_node2 28173 1726882775.62693: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.62705: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.62708: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.62712: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.63686: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bd 28173 1726882775.63689: WORKER PROCESS EXITING 28173 1726882775.65262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.68551: done with get_vars() 28173 1726882775.68590: done getting variables 28173 1726882775.68658: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:35 -0400 (0:00:00.128) 0:00:28.851 ****** 28173 1726882775.68698: entering _queue_task() for managed_node2/fail 28173 1726882775.69051: worker is 1 (out of 1 available) 28173 1726882775.69074: exiting _queue_task() for managed_node2/fail 28173 1726882775.69087: done queuing things up, now waiting for results queue to drain 28173 1726882775.69089: waiting for pending results... 
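Editor's note: the YUM variant above is additionally guarded by ansible_distribution_major_version | int < 8, and the log shows ansible-core redirecting ansible.builtin.yum to ansible.builtin.dnf before evaluating it. The earlier DNF task was evaluated under ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, so the two tasks act as roughly complementary version guards. A small debug task that shows which branch a host would take (both expressions are quoted verbatim from the log):

    - name: Show which package-manager branch this host would take
      ansible.builtin.debug:
        msg:
          - "dnf branch: {{ ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 }}"
          - "yum branch: {{ ansible_distribution_major_version | int < 8 }}"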
28173 1726882775.69409: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882775.69536: in run() - task 0e448fcc-3ce9-926c-8928-0000000000be 28173 1726882775.69557: variable 'ansible_search_path' from source: unknown 28173 1726882775.69571: variable 'ansible_search_path' from source: unknown 28173 1726882775.69619: calling self._execute() 28173 1726882775.69731: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.69744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.69762: variable 'omit' from source: magic vars 28173 1726882775.70831: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.70851: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.71104: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882775.71525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882775.77052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882775.77237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882775.77426: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882775.77468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882775.77505: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882775.77685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.77835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.77872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.77919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.77944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.78090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.78118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.78261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.78312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.78331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.78384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.78498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.78528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.78577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.78709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.79008: variable 'network_connections' from source: play vars 28173 1726882775.79140: variable 'profile' from source: play vars 28173 1726882775.79221: variable 'profile' from source: play vars 28173 1726882775.79344: variable 'interface' from source: set_fact 28173 1726882775.79414: variable 'interface' from source: set_fact 28173 1726882775.79500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882775.79852: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882775.80033: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882775.80072: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882775.80218: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882775.80270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882775.80299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882775.80444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.80480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882775.80530: 
variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882775.81011: variable 'network_connections' from source: play vars 28173 1726882775.81089: variable 'profile' from source: play vars 28173 1726882775.81152: variable 'profile' from source: play vars 28173 1726882775.81305: variable 'interface' from source: set_fact 28173 1726882775.81371: variable 'interface' from source: set_fact 28173 1726882775.81401: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882775.81411: when evaluation is False, skipping this task 28173 1726882775.81418: _execute() done 28173 1726882775.81426: dumping result to json 28173 1726882775.81433: done dumping result, returning 28173 1726882775.81527: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000be] 28173 1726882775.81546: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000be skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882775.81702: no more pending results, returning what we have 28173 1726882775.81706: results queue empty 28173 1726882775.81707: checking for any_errors_fatal 28173 1726882775.81715: done checking for any_errors_fatal 28173 1726882775.81715: checking for max_fail_percentage 28173 1726882775.81717: done checking for max_fail_percentage 28173 1726882775.81719: checking to see if all hosts have failed and the running result is not ok 28173 1726882775.81719: done checking to see if all hosts have failed 28173 1726882775.81720: getting the remaining hosts for this loop 28173 1726882775.81722: done getting the remaining hosts for this loop 28173 1726882775.81725: getting the next task for host managed_node2 28173 1726882775.81731: done getting next task for host managed_node2 28173 1726882775.81737: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28173 1726882775.81739: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882775.81753: getting variables 28173 1726882775.81754: in VariableManager get_vars() 28173 1726882775.81798: Calling all_inventory to load vars for managed_node2 28173 1726882775.81801: Calling groups_inventory to load vars for managed_node2 28173 1726882775.81803: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882775.81815: Calling all_plugins_play to load vars for managed_node2 28173 1726882775.81818: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882775.81821: Calling groups_plugins_play to load vars for managed_node2 28173 1726882775.83292: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000be 28173 1726882775.83295: WORKER PROCESS EXITING 28173 1726882775.84984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882775.88901: done with get_vars() 28173 1726882775.88928: done getting variables 28173 1726882775.88988: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:35 -0400 (0:00:00.203) 0:00:29.054 ****** 28173 1726882775.89018: entering _queue_task() for managed_node2/package 28173 1726882775.89334: worker is 1 (out of 1 available) 28173 1726882775.89347: exiting _queue_task() for managed_node2/package 28173 1726882775.89359: done queuing things up, now waiting for results queue to drain 28173 1726882775.89361: waiting for pending results... 
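Editor's note: throughout these guards the role keeps reading network_connections and profile from play vars and interface from set_fact, but their contents are not shown in this excerpt. As an assumption only, a minimal network_connections definition for a single profile on one interface commonly looks something like the sketch below; the keys and the ethernet type are illustrative, not taken from this log.

    network_connections:
      - name: "{{ profile }}"             # the profile name set in play vars
        interface_name: "{{ interface }}" # the interface name from set_fact
        type: ethernet                    # assumed; the connection type is not visible in this excerpt
        state: up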
28173 1726882775.90232: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28173 1726882775.90441: in run() - task 0e448fcc-3ce9-926c-8928-0000000000bf 28173 1726882775.90455: variable 'ansible_search_path' from source: unknown 28173 1726882775.90459: variable 'ansible_search_path' from source: unknown 28173 1726882775.90640: calling self._execute() 28173 1726882775.90811: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882775.90869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882775.90878: variable 'omit' from source: magic vars 28173 1726882775.91696: variable 'ansible_distribution_major_version' from source: facts 28173 1726882775.91711: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882775.92258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882775.92763: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882775.92816: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882775.92849: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882775.92913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882775.93035: variable 'network_packages' from source: role '' defaults 28173 1726882775.93155: variable '__network_provider_setup' from source: role '' defaults 28173 1726882775.93171: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882775.93245: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882775.93250: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882775.93355: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882775.93551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882775.95804: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882775.95872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882775.95912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882775.95949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882775.95978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882775.96471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.96475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.96478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.96480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.96482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.96484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.96487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.96489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.96491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.96493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.96882: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882775.96986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.97008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.97033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.97076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.97089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.97175: variable 'ansible_python' from source: facts 28173 1726882775.97202: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882775.97273: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882775.97355: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882775.97716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.97741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28173 1726882775.97768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.97806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.97932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.98052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882775.98078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882775.98101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.98138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882775.98270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882775.98513: variable 'network_connections' from source: play vars 28173 1726882775.98519: variable 'profile' from source: play vars 28173 1726882775.98941: variable 'profile' from source: play vars 28173 1726882775.98947: variable 'interface' from source: set_fact 28173 1726882775.99054: variable 'interface' from source: set_fact 28173 1726882775.99143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882775.99160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882775.99192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882775.99241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882775.99297: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882775.99584: variable 'network_connections' from source: play vars 28173 1726882775.99587: variable 'profile' from source: play vars 28173 1726882775.99686: variable 'profile' from source: play vars 28173 1726882775.99693: variable 'interface' from source: set_fact 28173 1726882775.99758: variable 'interface' from source: set_fact 28173 1726882775.99794: variable 
'__network_packages_default_wireless' from source: role '' defaults 28173 1726882775.99871: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882776.00178: variable 'network_connections' from source: play vars 28173 1726882776.00184: variable 'profile' from source: play vars 28173 1726882776.00251: variable 'profile' from source: play vars 28173 1726882776.00254: variable 'interface' from source: set_fact 28173 1726882776.00355: variable 'interface' from source: set_fact 28173 1726882776.00380: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882776.00457: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882776.00769: variable 'network_connections' from source: play vars 28173 1726882776.00772: variable 'profile' from source: play vars 28173 1726882776.00832: variable 'profile' from source: play vars 28173 1726882776.00835: variable 'interface' from source: set_fact 28173 1726882776.00934: variable 'interface' from source: set_fact 28173 1726882776.00991: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882776.01047: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882776.01055: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882776.01119: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882776.01770: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882776.02227: variable 'network_connections' from source: play vars 28173 1726882776.02230: variable 'profile' from source: play vars 28173 1726882776.02290: variable 'profile' from source: play vars 28173 1726882776.02294: variable 'interface' from source: set_fact 28173 1726882776.02355: variable 'interface' from source: set_fact 28173 1726882776.02363: variable 'ansible_distribution' from source: facts 28173 1726882776.02818: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.02826: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.02840: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882776.03007: variable 'ansible_distribution' from source: facts 28173 1726882776.03012: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.03015: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.03029: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882776.03191: variable 'ansible_distribution' from source: facts 28173 1726882776.03194: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.03199: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.03236: variable 'network_provider' from source: set_fact 28173 1726882776.03249: variable 'ansible_facts' from source: unknown 28173 1726882776.04183: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28173 1726882776.04186: when evaluation is False, skipping this task 28173 1726882776.04189: _execute() done 28173 1726882776.04191: dumping result to json 28173 1726882776.04193: done dumping result, returning 28173 1726882776.04201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0e448fcc-3ce9-926c-8928-0000000000bf] 28173 1726882776.04207: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bf 28173 1726882776.04309: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000bf 28173 1726882776.04312: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28173 1726882776.04386: no more pending results, returning what we have 28173 1726882776.04389: results queue empty 28173 1726882776.04390: checking for any_errors_fatal 28173 1726882776.04400: done checking for any_errors_fatal 28173 1726882776.04401: checking for max_fail_percentage 28173 1726882776.04402: done checking for max_fail_percentage 28173 1726882776.04403: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.04404: done checking to see if all hosts have failed 28173 1726882776.04405: getting the remaining hosts for this loop 28173 1726882776.04406: done getting the remaining hosts for this loop 28173 1726882776.04409: getting the next task for host managed_node2 28173 1726882776.04415: done getting next task for host managed_node2 28173 1726882776.04419: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882776.04421: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882776.04436: getting variables 28173 1726882776.04437: in VariableManager get_vars() 28173 1726882776.04477: Calling all_inventory to load vars for managed_node2 28173 1726882776.04479: Calling groups_inventory to load vars for managed_node2 28173 1726882776.04481: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.04495: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.04497: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.04500: Calling groups_plugins_play to load vars for managed_node2 28173 1726882776.06707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882776.08660: done with get_vars() 28173 1726882776.08689: done getting variables 28173 1726882776.08758: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:36 -0400 (0:00:00.197) 0:00:29.252 ****** 28173 1726882776.08792: entering _queue_task() for managed_node2/package 28173 1726882776.09130: worker is 1 (out of 1 available) 28173 1726882776.09150: exiting _queue_task() for managed_node2/package 28173 1726882776.09162: done queuing things up, now waiting for results queue to drain 28173 1726882776.09168: waiting for pending results... 
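The skip recorded above comes from the role's package-install task being conditioned on the gathered package facts: the logged false_condition, "not network_packages is subset(ansible_facts.packages.keys())", is only true when at least one requested package is missing from ansible_facts.packages (the subset test is the Ansible mathstuff test shown being loaded earlier). The role source is not reproduced in this log, so the following is only a minimal sketch reconstructed from the task name, the package action plugin, and the logged condition; the option layout and state value are assumptions, not the verbatim contents of roles/network/tasks/main.yml.

- name: Install packages
  ansible.builtin.package:
    # Sketch only: the real task's options may differ.
    name: "{{ network_packages }}"
    state: present
  when:
    # Matches the logged false_condition: skip when every requested
    # package already appears in the gathered package facts.
    - not network_packages is subset(ansible_facts.packages.keys())

Because the condition is decided from facts alone, a false result skips the task outright, which is why the log reports "skipping" with changed: false and no module transfer for this step.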
28173 1726882776.09502: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882776.09775: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c0 28173 1726882776.09797: variable 'ansible_search_path' from source: unknown 28173 1726882776.09811: variable 'ansible_search_path' from source: unknown 28173 1726882776.09877: calling self._execute() 28173 1726882776.09996: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.10009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.10031: variable 'omit' from source: magic vars 28173 1726882776.10505: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.10557: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882776.10671: variable 'network_state' from source: role '' defaults 28173 1726882776.10679: Evaluated conditional (network_state != {}): False 28173 1726882776.10682: when evaluation is False, skipping this task 28173 1726882776.10685: _execute() done 28173 1726882776.10691: dumping result to json 28173 1726882776.10694: done dumping result, returning 28173 1726882776.10702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-926c-8928-0000000000c0] 28173 1726882776.10708: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c0 28173 1726882776.10798: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c0 28173 1726882776.10800: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882776.10855: no more pending results, returning what we have 28173 1726882776.10859: results queue empty 28173 1726882776.10860: checking for any_errors_fatal 28173 1726882776.10870: done checking for any_errors_fatal 28173 1726882776.10871: checking for max_fail_percentage 28173 1726882776.10872: done checking for max_fail_percentage 28173 1726882776.10874: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.10874: done checking to see if all hosts have failed 28173 1726882776.10875: getting the remaining hosts for this loop 28173 1726882776.10877: done getting the remaining hosts for this loop 28173 1726882776.10880: getting the next task for host managed_node2 28173 1726882776.10885: done getting next task for host managed_node2 28173 1726882776.10888: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882776.10890: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882776.10905: getting variables 28173 1726882776.10906: in VariableManager get_vars() 28173 1726882776.10939: Calling all_inventory to load vars for managed_node2 28173 1726882776.10941: Calling groups_inventory to load vars for managed_node2 28173 1726882776.10943: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.10952: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.10954: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.10957: Calling groups_plugins_play to load vars for managed_node2 28173 1726882776.11836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882776.13828: done with get_vars() 28173 1726882776.13845: done getting variables 28173 1726882776.13895: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:36 -0400 (0:00:00.051) 0:00:29.303 ****** 28173 1726882776.13923: entering _queue_task() for managed_node2/package 28173 1726882776.14122: worker is 1 (out of 1 available) 28173 1726882776.14136: exiting _queue_task() for managed_node2/package 28173 1726882776.14147: done queuing things up, now waiting for results queue to drain 28173 1726882776.14148: waiting for pending results... 
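The same gating pattern produces the skip above: network_state comes from the role defaults as an empty mapping, so "network_state != {}" evaluates to False and the package action never runs. A hedged sketch of such a task follows; the package list is an illustrative assumption, since the log records only the task name, the package action plugin, and the condition.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    # Illustrative package list; the real list is not shown in this log.
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}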
28173 1726882776.14331: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882776.14398: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c1 28173 1726882776.14409: variable 'ansible_search_path' from source: unknown 28173 1726882776.14413: variable 'ansible_search_path' from source: unknown 28173 1726882776.14445: calling self._execute() 28173 1726882776.14519: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.14523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.14531: variable 'omit' from source: magic vars 28173 1726882776.14810: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.14821: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882776.14921: variable 'network_state' from source: role '' defaults 28173 1726882776.14930: Evaluated conditional (network_state != {}): False 28173 1726882776.14933: when evaluation is False, skipping this task 28173 1726882776.14936: _execute() done 28173 1726882776.14938: dumping result to json 28173 1726882776.15206: done dumping result, returning 28173 1726882776.15209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-926c-8928-0000000000c1] 28173 1726882776.15212: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c1 28173 1726882776.15278: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c1 28173 1726882776.15281: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882776.15318: no more pending results, returning what we have 28173 1726882776.15321: results queue empty 28173 1726882776.15321: checking for any_errors_fatal 28173 1726882776.15328: done checking for any_errors_fatal 28173 1726882776.15329: checking for max_fail_percentage 28173 1726882776.15331: done checking for max_fail_percentage 28173 1726882776.15332: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.15333: done checking to see if all hosts have failed 28173 1726882776.15334: getting the remaining hosts for this loop 28173 1726882776.15335: done getting the remaining hosts for this loop 28173 1726882776.15339: getting the next task for host managed_node2 28173 1726882776.15343: done getting next task for host managed_node2 28173 1726882776.15348: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882776.15350: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882776.15362: getting variables 28173 1726882776.15368: in VariableManager get_vars() 28173 1726882776.15401: Calling all_inventory to load vars for managed_node2 28173 1726882776.15404: Calling groups_inventory to load vars for managed_node2 28173 1726882776.15406: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.15416: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.15419: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.15422: Calling groups_plugins_play to load vars for managed_node2 28173 1726882776.16916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882776.17902: done with get_vars() 28173 1726882776.17917: done getting variables 28173 1726882776.17956: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:36 -0400 (0:00:00.040) 0:00:29.344 ****** 28173 1726882776.17985: entering _queue_task() for managed_node2/service 28173 1726882776.18172: worker is 1 (out of 1 available) 28173 1726882776.18186: exiting _queue_task() for managed_node2/service 28173 1726882776.18196: done queuing things up, now waiting for results queue to drain 28173 1726882776.18197: waiting for pending results... 
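The task queued just above, "Restart NetworkManager due to wireless or team interfaces", uses the service action and is evaluated in the lines that follow, where both __network_wireless_connections_defined and __network_team_connections_defined resolve to False and the task is skipped. As a minimal sketch only, with the service name written literally as an assumption (the real task may take it from a role variable instead), such a task could look like:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    # Assumed literal service name for illustration; not quoted from the role.
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined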
28173 1726882776.18373: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882776.18446: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c2 28173 1726882776.18458: variable 'ansible_search_path' from source: unknown 28173 1726882776.18462: variable 'ansible_search_path' from source: unknown 28173 1726882776.18493: calling self._execute() 28173 1726882776.18566: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.18574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.18581: variable 'omit' from source: magic vars 28173 1726882776.18895: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.18906: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882776.19006: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882776.19571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882776.21967: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882776.22034: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882776.22098: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882776.22107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882776.22132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882776.22212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.22252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.22284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.22325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.22338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.22387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.22409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.22434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28173 1726882776.22478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.22493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.22532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.22555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.22749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.22793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.22808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.22989: variable 'network_connections' from source: play vars 28173 1726882776.22999: variable 'profile' from source: play vars 28173 1726882776.23074: variable 'profile' from source: play vars 28173 1726882776.23080: variable 'interface' from source: set_fact 28173 1726882776.23139: variable 'interface' from source: set_fact 28173 1726882776.23218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882776.23377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882776.23503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882776.23506: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882776.23509: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882776.23528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882776.23636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882776.23639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.23641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882776.23644: variable '__network_team_connections_defined' from source: role '' defaults 28173 
1726882776.24096: variable 'network_connections' from source: play vars 28173 1726882776.24106: variable 'profile' from source: play vars 28173 1726882776.24175: variable 'profile' from source: play vars 28173 1726882776.24187: variable 'interface' from source: set_fact 28173 1726882776.24256: variable 'interface' from source: set_fact 28173 1726882776.24291: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882776.24307: when evaluation is False, skipping this task 28173 1726882776.24315: _execute() done 28173 1726882776.24321: dumping result to json 28173 1726882776.24328: done dumping result, returning 28173 1726882776.24338: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000c2] 28173 1726882776.24353: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c2 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882776.24501: no more pending results, returning what we have 28173 1726882776.24504: results queue empty 28173 1726882776.24505: checking for any_errors_fatal 28173 1726882776.24511: done checking for any_errors_fatal 28173 1726882776.24512: checking for max_fail_percentage 28173 1726882776.24514: done checking for max_fail_percentage 28173 1726882776.24515: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.24516: done checking to see if all hosts have failed 28173 1726882776.24516: getting the remaining hosts for this loop 28173 1726882776.24518: done getting the remaining hosts for this loop 28173 1726882776.24522: getting the next task for host managed_node2 28173 1726882776.24528: done getting next task for host managed_node2 28173 1726882776.24532: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882776.24534: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882776.24547: getting variables 28173 1726882776.24549: in VariableManager get_vars() 28173 1726882776.24589: Calling all_inventory to load vars for managed_node2 28173 1726882776.24591: Calling groups_inventory to load vars for managed_node2 28173 1726882776.24594: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.24605: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.24608: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.24611: Calling groups_plugins_play to load vars for managed_node2 28173 1726882776.25477: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c2 28173 1726882776.25481: WORKER PROCESS EXITING 28173 1726882776.25955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882776.26922: done with get_vars() 28173 1726882776.26939: done getting variables 28173 1726882776.26989: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:36 -0400 (0:00:00.090) 0:00:29.434 ****** 28173 1726882776.27011: entering _queue_task() for managed_node2/service 28173 1726882776.27223: worker is 1 (out of 1 available) 28173 1726882776.27237: exiting _queue_task() for managed_node2/service 28173 1726882776.27250: done queuing things up, now waiting for results queue to drain 28173 1726882776.27251: waiting for pending results... 
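Unlike the preceding steps, the "Enable and start NetworkManager" task queued above does run: the lines below show its conditional (network_provider == "nm" or network_state != {}) evaluating to True, the service name being resolved from the role variable network_service_name, and the service action ultimately shipping the systemd module over SSH. As a hedged approximation only (the started/enabled values are inferred from the task name, not quoted from the role), the task could be shaped like:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    # The log shows the name resolving from the role's
    # network_service_name variable for this host.
    name: "{{ network_service_name }}"
    state: started   # assumption from the task name
    enabled: true    # assumption from the task name
  when: network_provider == "nm" or network_state != {}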
28173 1726882776.27436: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882776.27508: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c3 28173 1726882776.27523: variable 'ansible_search_path' from source: unknown 28173 1726882776.27526: variable 'ansible_search_path' from source: unknown 28173 1726882776.27555: calling self._execute() 28173 1726882776.27632: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.27637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.27646: variable 'omit' from source: magic vars 28173 1726882776.27947: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.27958: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882776.28071: variable 'network_provider' from source: set_fact 28173 1726882776.28076: variable 'network_state' from source: role '' defaults 28173 1726882776.28082: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28173 1726882776.28091: variable 'omit' from source: magic vars 28173 1726882776.28114: variable 'omit' from source: magic vars 28173 1726882776.28136: variable 'network_service_name' from source: role '' defaults 28173 1726882776.28197: variable 'network_service_name' from source: role '' defaults 28173 1726882776.28274: variable '__network_provider_setup' from source: role '' defaults 28173 1726882776.28280: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882776.28326: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882776.28332: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882776.28379: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882776.28524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882776.30681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882776.30727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882776.30754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882776.30789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882776.30812: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882776.30870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.30889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.30908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.30936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 28173 1726882776.30946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.30978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.30994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.31011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.31039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.31049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.31199: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882776.31274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.31291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.31307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.31332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.31345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.31405: variable 'ansible_python' from source: facts 28173 1726882776.31421: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882776.31480: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882776.31533: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882776.31616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.31633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.31649: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.31682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.31692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.31723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.31742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.31759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.31791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.31800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.31890: variable 'network_connections' from source: play vars 28173 1726882776.31895: variable 'profile' from source: play vars 28173 1726882776.31946: variable 'profile' from source: play vars 28173 1726882776.31949: variable 'interface' from source: set_fact 28173 1726882776.32032: variable 'interface' from source: set_fact 28173 1726882776.32125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882776.32706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882776.32751: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882776.32795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882776.32845: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882776.32903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882776.32932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882776.32959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.32992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 28173 1726882776.33035: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882776.33298: variable 'network_connections' from source: play vars 28173 1726882776.33303: variable 'profile' from source: play vars 28173 1726882776.33376: variable 'profile' from source: play vars 28173 1726882776.33379: variable 'interface' from source: set_fact 28173 1726882776.33436: variable 'interface' from source: set_fact 28173 1726882776.33470: variable '__network_packages_default_wireless' from source: role '' defaults 28173 1726882776.33543: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882776.33818: variable 'network_connections' from source: play vars 28173 1726882776.33821: variable 'profile' from source: play vars 28173 1726882776.33891: variable 'profile' from source: play vars 28173 1726882776.33895: variable 'interface' from source: set_fact 28173 1726882776.33969: variable 'interface' from source: set_fact 28173 1726882776.33990: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882776.34068: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882776.34337: variable 'network_connections' from source: play vars 28173 1726882776.34340: variable 'profile' from source: play vars 28173 1726882776.34409: variable 'profile' from source: play vars 28173 1726882776.34412: variable 'interface' from source: set_fact 28173 1726882776.34482: variable 'interface' from source: set_fact 28173 1726882776.34534: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882776.34592: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882776.34598: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882776.34656: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882776.34871: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882776.35332: variable 'network_connections' from source: play vars 28173 1726882776.35335: variable 'profile' from source: play vars 28173 1726882776.35394: variable 'profile' from source: play vars 28173 1726882776.35397: variable 'interface' from source: set_fact 28173 1726882776.35462: variable 'interface' from source: set_fact 28173 1726882776.35471: variable 'ansible_distribution' from source: facts 28173 1726882776.35474: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.35483: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.35495: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882776.35659: variable 'ansible_distribution' from source: facts 28173 1726882776.35663: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.35671: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.35683: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882776.35848: variable 'ansible_distribution' from source: facts 28173 1726882776.35852: variable '__network_rh_distros' from source: role '' defaults 28173 1726882776.35856: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.35894: variable 'network_provider' from source: set_fact 28173 1726882776.35914: variable 
'omit' from source: magic vars 28173 1726882776.35939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882776.35968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882776.35983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882776.36000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882776.36009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882776.36036: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882776.36040: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.36042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.36136: Set connection var ansible_pipelining to False 28173 1726882776.36142: Set connection var ansible_shell_type to sh 28173 1726882776.36151: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882776.36159: Set connection var ansible_timeout to 10 28173 1726882776.36169: Set connection var ansible_shell_executable to /bin/sh 28173 1726882776.36172: Set connection var ansible_connection to ssh 28173 1726882776.36198: variable 'ansible_shell_executable' from source: unknown 28173 1726882776.36201: variable 'ansible_connection' from source: unknown 28173 1726882776.36203: variable 'ansible_module_compression' from source: unknown 28173 1726882776.36206: variable 'ansible_shell_type' from source: unknown 28173 1726882776.36208: variable 'ansible_shell_executable' from source: unknown 28173 1726882776.36210: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.36215: variable 'ansible_pipelining' from source: unknown 28173 1726882776.36218: variable 'ansible_timeout' from source: unknown 28173 1726882776.36220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.36332: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882776.36341: variable 'omit' from source: magic vars 28173 1726882776.36346: starting attempt loop 28173 1726882776.36349: running the handler 28173 1726882776.36426: variable 'ansible_facts' from source: unknown 28173 1726882776.37216: _low_level_execute_command(): starting 28173 1726882776.37222: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882776.37922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882776.37933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.37944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.37958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.37999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.38006: stderr chunk (state=3): >>>debug2: match not found <<< 28173 
1726882776.38017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.38030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882776.38039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882776.38044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882776.38052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.38061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.38077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.38083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.38095: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882776.38108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.38182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882776.38206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.38223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.38372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.40020: stdout chunk (state=3): >>>/root <<< 28173 1726882776.40210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882776.40214: stdout chunk (state=3): >>><<< 28173 1726882776.40216: stderr chunk (state=3): >>><<< 28173 1726882776.40299: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882776.40307: _low_level_execute_command(): starting 28173 1726882776.40312: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494 `" && echo ansible-tmp-1726882776.4025917-29445-243571336486494="` echo /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494 `" ) && sleep 0' 28173 1726882776.41037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 28173 1726882776.41053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.41081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.41107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.41161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.41185: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882776.41213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.41234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882776.41246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882776.41258: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882776.41274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.41297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.41325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.41341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.41353: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882776.41369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.41459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882776.41484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.41503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.41640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.43517: stdout chunk (state=3): >>>ansible-tmp-1726882776.4025917-29445-243571336486494=/root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494 <<< 28173 1726882776.43699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882776.43703: stdout chunk (state=3): >>><<< 28173 1726882776.43706: stderr chunk (state=3): >>><<< 28173 1726882776.44074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882776.4025917-29445-243571336486494=/root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882776.44081: variable 'ansible_module_compression' from source: unknown 28173 1726882776.44084: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28173 1726882776.44086: variable 'ansible_facts' from source: unknown 28173 1726882776.44142: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/AnsiballZ_systemd.py 28173 1726882776.44312: Sending initial data 28173 1726882776.44316: Sent initial data (156 bytes) 28173 1726882776.45284: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882776.45297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.45310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.45326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.45367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.45384: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882776.45397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.45413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882776.45423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882776.45433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882776.45443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.45454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.45471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.45483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.45498: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882776.45510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.45588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882776.45625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.45641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.45769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.47500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882776.47604: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882776.47705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6r8irwgo /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/AnsiballZ_systemd.py <<< 28173 1726882776.47799: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882776.51220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882776.51432: stderr chunk (state=3): >>><<< 28173 1726882776.51436: stdout chunk (state=3): >>><<< 28173 1726882776.51438: done transferring module to remote 28173 1726882776.51440: _low_level_execute_command(): starting 28173 1726882776.51443: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/ /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/AnsiballZ_systemd.py && sleep 0' 28173 1726882776.51992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882776.52005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.52018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.52035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.52082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.52094: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882776.52106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.52122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882776.52132: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882776.52142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882776.52152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.52166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.52182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.52196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.52206: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882776.52218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.52296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882776.52317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.52331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.52454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.54250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882776.54337: stderr chunk 
(state=3): >>><<< 28173 1726882776.54423: stdout chunk (state=3): >>><<< 28173 1726882776.54468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882776.54475: _low_level_execute_command(): starting 28173 1726882776.54478: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/AnsiballZ_systemd.py && sleep 0' 28173 1726882776.55566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882776.55686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.55702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.55722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.55766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.56022: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882776.56036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.56052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882776.56065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882776.56077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882776.56089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.56104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882776.56119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.56130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.56140: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882776.56152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.56228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882776.56252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.56271: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.56413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.81368: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9211904", "MemoryAvailable": "infinity", "CPUUsageNSec": "1981594000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 28173 1726882776.81407: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": 
"Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28173 1726882776.82926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882776.82976: stderr chunk (state=3): >>><<< 28173 1726882776.82980: stdout chunk (state=3): >>><<< 28173 1726882776.82995: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9211904", "MemoryAvailable": "infinity", "CPUUsageNSec": "1981594000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882776.83103: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882776.83116: _low_level_execute_command(): starting 28173 1726882776.83121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882776.4025917-29445-243571336486494/ > /dev/null 2>&1 && sleep 0' 28173 1726882776.83533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882776.83538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882776.83569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.83582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882776.83592: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882776.83643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882776.83649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882776.83761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882776.85569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882776.85610: stderr chunk (state=3): >>><<< 28173 1726882776.85614: stdout chunk (state=3): >>><<< 28173 1726882776.85624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882776.85630: handler run complete 28173 1726882776.85669: attempt loop complete, returning result 28173 1726882776.85673: _execute() done 28173 1726882776.85676: dumping result to json 28173 1726882776.85692: done dumping result, returning 28173 1726882776.85700: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-926c-8928-0000000000c3] 28173 1726882776.85705: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c3 28173 1726882776.85889: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c3 28173 1726882776.85892: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882776.85942: no more pending results, returning what we have 28173 1726882776.85945: results queue empty 28173 1726882776.85946: checking for any_errors_fatal 28173 1726882776.85955: done checking for any_errors_fatal 28173 1726882776.85956: checking for max_fail_percentage 28173 1726882776.85958: done checking for max_fail_percentage 28173 1726882776.85959: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.85960: done checking to see if all hosts have failed 28173 1726882776.85960: getting the remaining hosts for this loop 28173 1726882776.85962: done getting the remaining hosts for this loop 28173 1726882776.85970: getting the next task for host managed_node2 28173 1726882776.85976: done getting next task for host managed_node2 28173 1726882776.85979: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882776.85981: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882776.85991: getting variables 28173 1726882776.85992: in VariableManager get_vars() 28173 1726882776.86031: Calling all_inventory to load vars for managed_node2 28173 1726882776.86034: Calling groups_inventory to load vars for managed_node2 28173 1726882776.86036: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.86045: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.86047: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.86049: Calling groups_plugins_play to load vars for managed_node2 28173 1726882776.86901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882776.87871: done with get_vars() 28173 1726882776.87889: done getting variables 28173 1726882776.87931: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:36 -0400 (0:00:00.609) 0:00:30.043 ****** 28173 1726882776.87953: entering _queue_task() for managed_node2/service 28173 1726882776.88153: worker is 1 (out of 1 available) 28173 1726882776.88168: exiting _queue_task() for managed_node2/service 28173 1726882776.88181: done queuing things up, now waiting for results queue to drain 28173 1726882776.88182: waiting for pending results... 28173 1726882776.88373: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882776.88446: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c4 28173 1726882776.88458: variable 'ansible_search_path' from source: unknown 28173 1726882776.88462: variable 'ansible_search_path' from source: unknown 28173 1726882776.88492: calling self._execute() 28173 1726882776.88568: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882776.88572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882776.88579: variable 'omit' from source: magic vars 28173 1726882776.88852: variable 'ansible_distribution_major_version' from source: facts 28173 1726882776.88864: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882776.88945: variable 'network_provider' from source: set_fact 28173 1726882776.88950: Evaluated conditional (network_provider == "nm"): True 28173 1726882776.89011: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882776.89075: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882776.89206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882776.94785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882776.94827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882776.94853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882776.94880: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882776.94899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882776.94959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.94982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.94998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.95024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.95037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.95067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.95086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.95103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.95127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.95143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882776.95170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882776.95187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882776.95203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.95227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882776.95237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28173 1726882776.95331: variable 'network_connections' from source: play vars 28173 1726882776.95340: variable 'profile' from source: play vars 28173 1726882776.95394: variable 'profile' from source: play vars 28173 1726882776.95397: variable 'interface' from source: set_fact 28173 1726882776.95440: variable 'interface' from source: set_fact 28173 1726882776.95493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882776.95597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882776.95623: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882776.95643: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882776.95664: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882776.95699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882776.95715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882776.95731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882776.95748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882776.95779: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882776.95935: variable 'network_connections' from source: play vars 28173 1726882776.95938: variable 'profile' from source: play vars 28173 1726882776.95983: variable 'profile' from source: play vars 28173 1726882776.95987: variable 'interface' from source: set_fact 28173 1726882776.96030: variable 'interface' from source: set_fact 28173 1726882776.96050: Evaluated conditional (__network_wpa_supplicant_required): False 28173 1726882776.96053: when evaluation is False, skipping this task 28173 1726882776.96056: _execute() done 28173 1726882776.96068: dumping result to json 28173 1726882776.96070: done dumping result, returning 28173 1726882776.96073: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-926c-8928-0000000000c4] 28173 1726882776.96075: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c4 28173 1726882776.96155: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c4 28173 1726882776.96158: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28173 1726882776.96203: no more pending results, returning what we have 28173 1726882776.96206: results queue empty 28173 1726882776.96206: checking for any_errors_fatal 28173 1726882776.96222: done checking for any_errors_fatal 28173 1726882776.96222: checking for max_fail_percentage 28173 1726882776.96224: done checking for max_fail_percentage 28173 
1726882776.96225: checking to see if all hosts have failed and the running result is not ok 28173 1726882776.96226: done checking to see if all hosts have failed 28173 1726882776.96227: getting the remaining hosts for this loop 28173 1726882776.96228: done getting the remaining hosts for this loop 28173 1726882776.96231: getting the next task for host managed_node2 28173 1726882776.96236: done getting next task for host managed_node2 28173 1726882776.96240: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882776.96241: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882776.96253: getting variables 28173 1726882776.96255: in VariableManager get_vars() 28173 1726882776.96300: Calling all_inventory to load vars for managed_node2 28173 1726882776.96302: Calling groups_inventory to load vars for managed_node2 28173 1726882776.96304: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882776.96313: Calling all_plugins_play to load vars for managed_node2 28173 1726882776.96316: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882776.96318: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.00840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.01791: done with get_vars() 28173 1726882777.01807: done getting variables 28173 1726882777.01839: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:37 -0400 (0:00:00.139) 0:00:30.182 ****** 28173 1726882777.01856: entering _queue_task() for managed_node2/service 28173 1726882777.02119: worker is 1 (out of 1 available) 28173 1726882777.02131: exiting _queue_task() for managed_node2/service 28173 1726882777.02142: done queuing things up, now waiting for results queue to drain 28173 1726882777.02143: waiting for pending results... 
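The wpa_supplicant skip above is driven purely by conditionals: ansible_distribution_major_version != '6' and network_provider == "nm" both evaluate True, while __network_wpa_supplicant_required evaluates False and is reported as the false_condition. The sketch below illustrates the kind of guarded service task that produces exactly this "Conditional result was False" skip; the variable names and values come from the log, while the service name and task layout are assumptions rather than the role's actual task file.

  # Sketch only: a guarded service task that is skipped the same way as in the log.
  - hosts: managed_node2
    gather_facts: false
    vars:
      network_provider: nm
      __network_wpa_supplicant_required: false   # as evaluated in the log
    tasks:
      - name: Enable and start wpa_supplicant
        ansible.builtin.systemd:
          name: wpa_supplicant   # service name assumed from the task title
          state: started
          enabled: true
        when:
          - network_provider == "nm"
          - __network_wpa_supplicant_required | bool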
28173 1726882777.02379: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882777.02502: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c5 28173 1726882777.02523: variable 'ansible_search_path' from source: unknown 28173 1726882777.02532: variable 'ansible_search_path' from source: unknown 28173 1726882777.02576: calling self._execute() 28173 1726882777.02686: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.02697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.02716: variable 'omit' from source: magic vars 28173 1726882777.03121: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.03140: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.03251: variable 'network_provider' from source: set_fact 28173 1726882777.03264: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882777.03272: when evaluation is False, skipping this task 28173 1726882777.03277: _execute() done 28173 1726882777.03283: dumping result to json 28173 1726882777.03288: done dumping result, returning 28173 1726882777.03296: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-926c-8928-0000000000c5] 28173 1726882777.03304: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c5 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882777.03438: no more pending results, returning what we have 28173 1726882777.03442: results queue empty 28173 1726882777.03442: checking for any_errors_fatal 28173 1726882777.03456: done checking for any_errors_fatal 28173 1726882777.03457: checking for max_fail_percentage 28173 1726882777.03458: done checking for max_fail_percentage 28173 1726882777.03459: checking to see if all hosts have failed and the running result is not ok 28173 1726882777.03460: done checking to see if all hosts have failed 28173 1726882777.03461: getting the remaining hosts for this loop 28173 1726882777.03465: done getting the remaining hosts for this loop 28173 1726882777.03468: getting the next task for host managed_node2 28173 1726882777.03476: done getting next task for host managed_node2 28173 1726882777.03480: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882777.03483: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882777.03499: getting variables 28173 1726882777.03501: in VariableManager get_vars() 28173 1726882777.03539: Calling all_inventory to load vars for managed_node2 28173 1726882777.03541: Calling groups_inventory to load vars for managed_node2 28173 1726882777.03544: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.03555: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.03559: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.03562: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.04735: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c5 28173 1726882777.04738: WORKER PROCESS EXITING 28173 1726882777.05359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.07406: done with get_vars() 28173 1726882777.07429: done getting variables 28173 1726882777.07487: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:37 -0400 (0:00:00.056) 0:00:30.239 ****** 28173 1726882777.07516: entering _queue_task() for managed_node2/copy 28173 1726882777.07787: worker is 1 (out of 1 available) 28173 1726882777.07798: exiting _queue_task() for managed_node2/copy 28173 1726882777.07810: done queuing things up, now waiting for results queue to drain 28173 1726882777.07811: waiting for pending results... 
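This skip, like the "Ensure initscripts network file dependency is present" skip that follows, hinges on the same test: network_provider == "initscripts", which is False because network_provider was set to "nm" earlier in the run. A minimal sketch of that gating pattern is shown below; the service name "network" is an assumption for illustration, since the log records only the conditional and the loaded 'service' action plugin.

  # Sketch only: the initscripts-only gate that evaluates to False when
  # network_provider is "nm", yielding the skips recorded in the log.
  - hosts: managed_node2
    gather_facts: false
    vars:
      network_provider: nm   # set_fact value reported earlier in the run
    tasks:
      - name: Enable network service
        ansible.builtin.service:
          name: network       # assumed; not present in the censored result
          enabled: true
        when: network_provider == "initscripts"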
28173 1726882777.08088: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882777.08224: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c6 28173 1726882777.08246: variable 'ansible_search_path' from source: unknown 28173 1726882777.08259: variable 'ansible_search_path' from source: unknown 28173 1726882777.08305: calling self._execute() 28173 1726882777.08423: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.08434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.08448: variable 'omit' from source: magic vars 28173 1726882777.09002: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.09022: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.09151: variable 'network_provider' from source: set_fact 28173 1726882777.09160: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882777.09169: when evaluation is False, skipping this task 28173 1726882777.09175: _execute() done 28173 1726882777.09180: dumping result to json 28173 1726882777.09184: done dumping result, returning 28173 1726882777.09193: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-926c-8928-0000000000c6] 28173 1726882777.09202: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c6 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28173 1726882777.09340: no more pending results, returning what we have 28173 1726882777.09344: results queue empty 28173 1726882777.09344: checking for any_errors_fatal 28173 1726882777.09350: done checking for any_errors_fatal 28173 1726882777.09351: checking for max_fail_percentage 28173 1726882777.09353: done checking for max_fail_percentage 28173 1726882777.09354: checking to see if all hosts have failed and the running result is not ok 28173 1726882777.09355: done checking to see if all hosts have failed 28173 1726882777.09355: getting the remaining hosts for this loop 28173 1726882777.09357: done getting the remaining hosts for this loop 28173 1726882777.09360: getting the next task for host managed_node2 28173 1726882777.09368: done getting next task for host managed_node2 28173 1726882777.09372: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882777.09374: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882777.09388: getting variables 28173 1726882777.09390: in VariableManager get_vars() 28173 1726882777.09425: Calling all_inventory to load vars for managed_node2 28173 1726882777.09427: Calling groups_inventory to load vars for managed_node2 28173 1726882777.09430: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.09441: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.09443: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.09446: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.10482: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c6 28173 1726882777.10486: WORKER PROCESS EXITING 28173 1726882777.11132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.12891: done with get_vars() 28173 1726882777.12920: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:37 -0400 (0:00:00.054) 0:00:30.294 ****** 28173 1726882777.13003: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882777.13290: worker is 1 (out of 1 available) 28173 1726882777.13302: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882777.13314: done queuing things up, now waiting for results queue to drain 28173 1726882777.13315: waiting for pending results... 28173 1726882777.13600: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882777.13728: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c7 28173 1726882777.13748: variable 'ansible_search_path' from source: unknown 28173 1726882777.13759: variable 'ansible_search_path' from source: unknown 28173 1726882777.13802: calling self._execute() 28173 1726882777.13908: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.13918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.13930: variable 'omit' from source: magic vars 28173 1726882777.14318: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.14335: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.14346: variable 'omit' from source: magic vars 28173 1726882777.14388: variable 'omit' from source: magic vars 28173 1726882777.14552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882777.17168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882777.17236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882777.17287: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882777.17326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882777.17359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882777.17436: variable 'network_provider' from source: set_fact 28173 1726882777.17566: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882777.17598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882777.17626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882777.17675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882777.17693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882777.17769: variable 'omit' from source: magic vars 28173 1726882777.17887: variable 'omit' from source: magic vars 28173 1726882777.17995: variable 'network_connections' from source: play vars 28173 1726882777.18010: variable 'profile' from source: play vars 28173 1726882777.18079: variable 'profile' from source: play vars 28173 1726882777.18088: variable 'interface' from source: set_fact 28173 1726882777.18153: variable 'interface' from source: set_fact 28173 1726882777.18300: variable 'omit' from source: magic vars 28173 1726882777.18317: variable '__lsr_ansible_managed' from source: task vars 28173 1726882777.18379: variable '__lsr_ansible_managed' from source: task vars 28173 1726882777.18649: Loaded config def from plugin (lookup/template) 28173 1726882777.18658: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28173 1726882777.18692: File lookup term: get_ansible_managed.j2 28173 1726882777.18700: variable 'ansible_search_path' from source: unknown 28173 1726882777.18707: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28173 1726882777.18722: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28173 1726882777.18742: variable 'ansible_search_path' from source: unknown 28173 1726882777.28157: variable 'ansible_managed' from source: unknown 28173 
1726882777.28571: variable 'omit' from source: magic vars 28173 1726882777.28699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882777.28733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882777.28835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882777.28857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.28934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.28972: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882777.28981: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.28990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.29205: Set connection var ansible_pipelining to False 28173 1726882777.29212: Set connection var ansible_shell_type to sh 28173 1726882777.29224: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882777.29234: Set connection var ansible_timeout to 10 28173 1726882777.29259: Set connection var ansible_shell_executable to /bin/sh 28173 1726882777.29326: Set connection var ansible_connection to ssh 28173 1726882777.29351: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.29361: variable 'ansible_connection' from source: unknown 28173 1726882777.29371: variable 'ansible_module_compression' from source: unknown 28173 1726882777.29378: variable 'ansible_shell_type' from source: unknown 28173 1726882777.29384: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.29391: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.29398: variable 'ansible_pipelining' from source: unknown 28173 1726882777.29403: variable 'ansible_timeout' from source: unknown 28173 1726882777.29409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.29540: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882777.29560: variable 'omit' from source: magic vars 28173 1726882777.29576: starting attempt loop 28173 1726882777.29585: running the handler 28173 1726882777.29599: _low_level_execute_command(): starting 28173 1726882777.29609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882777.30586: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.30602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.30618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.30636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.30686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.30700: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.30713: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.30729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.30758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.30771: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.30787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.30800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.30817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.30831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.30843: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.30858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.30938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.30959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882777.30979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.31123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.32784: stdout chunk (state=3): >>>/root <<< 28173 1726882777.32962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882777.32968: stdout chunk (state=3): >>><<< 28173 1726882777.32970: stderr chunk (state=3): >>><<< 28173 1726882777.33071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882777.33075: _low_level_execute_command(): starting 28173 1726882777.33079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998 `" && echo ansible-tmp-1726882777.3299146-29479-279532859499998="` echo /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998 `" ) && sleep 0' 28173 1726882777.33809: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.33819: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 28173 1726882777.33830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.33843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.33892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.33900: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.33910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.33924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.33932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.33939: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.33946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.33956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.33972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.33987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.33994: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.34004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.34081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.34102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882777.34112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.34320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.36205: stdout chunk (state=3): >>>ansible-tmp-1726882777.3299146-29479-279532859499998=/root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998 <<< 28173 1726882777.36320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882777.36402: stderr chunk (state=3): >>><<< 28173 1726882777.36405: stdout chunk (state=3): >>><<< 28173 1726882777.36747: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882777.3299146-29479-279532859499998=/root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882777.36753: variable 'ansible_module_compression' from source: unknown 28173 1726882777.36755: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28173 1726882777.36757: variable 'ansible_facts' from source: unknown 28173 1726882777.36759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/AnsiballZ_network_connections.py 28173 1726882777.36826: Sending initial data 28173 1726882777.36829: Sent initial data (168 bytes) 28173 1726882777.37865: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.37884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.37898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.37920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.37970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.37986: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.38000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.38016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.38032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.38051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.38068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.38084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.38105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.38119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.38136: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.38151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.38238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.38259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882777.38283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.38412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.40152: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882777.40250: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882777.40351: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpyol2aob5 /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/AnsiballZ_network_connections.py <<< 28173 1726882777.40442: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882777.42736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882777.42868: stderr chunk (state=3): >>><<< 28173 1726882777.42872: stdout chunk (state=3): >>><<< 28173 1726882777.42874: done transferring module to remote 28173 1726882777.42876: _low_level_execute_command(): starting 28173 1726882777.42878: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/ /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/AnsiballZ_network_connections.py && sleep 0' 28173 1726882777.43591: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.43600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.43611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.43626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.43662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.43676: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.43687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.43699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.43707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.43713: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.43721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.43730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.43742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.43748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.43754: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.43765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.43833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.43851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882777.43859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.44004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.45805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882777.45891: 
stderr chunk (state=3): >>><<< 28173 1726882777.45894: stdout chunk (state=3): >>><<< 28173 1726882777.45912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882777.45921: _low_level_execute_command(): starting 28173 1726882777.45924: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/AnsiballZ_network_connections.py && sleep 0' 28173 1726882777.46550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.46558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.46577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.46628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.46635: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.46645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.46658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.46667: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.46678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.46686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.46695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.46706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.46713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.46719: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.46728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.46814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.46818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
28173 1726882777.46827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.46960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.73928: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28173 1726882777.75586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882777.75661: stderr chunk (state=3): >>><<< 28173 1726882777.75666: stdout chunk (state=3): >>><<< 28173 1726882777.75771: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
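The run above walks through the connection plugin's standard module round trip for this task: create a remote temp directory under ~/.ansible/tmp, transfer the AnsiballZ_network_connections.py wrapper over the multiplexed SSH session, chmod it, execute it with the remote /usr/bin/python3.9, and read a single JSON document back on stdout (the rm -f -r cleanup follows just below). The sketch here reproduces that round trip with plain ssh/scp subprocess calls; it is not Ansible's implementation, and the host address, key setup, and local payload path are assumptions.

    # Minimal sketch of the remote execution sequence seen in the log above:
    # temp dir -> transfer payload -> chmod -> run with remote Python -> parse
    # the JSON result -> clean up. Not Ansible's code; host and paths assumed.
    import json
    import subprocess
    import time

    HOST = "root@10.31.11.158"                                # assumed reachable test host
    LOCAL_PAYLOAD = "/tmp/AnsiballZ_network_connections.py"   # hypothetical local copy

    def ssh(cmd: str) -> subprocess.CompletedProcess:
        # BatchMode mirrors the non-interactive behaviour of the logged run
        return subprocess.run(["ssh", "-o", "BatchMode=yes", HOST, cmd],
                              capture_output=True, text=True, check=True)

    # 1. remote temp dir, named in the same spirit as ansible-tmp-<timestamp>-...
    remote_tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f"umask 77 && mkdir -p {remote_tmp}")

    # 2. transfer the wrapped module (scp here instead of the sftp session in the log)
    subprocess.run(["scp", "-q", LOCAL_PAYLOAD,
                    f"{HOST}:{remote_tmp}/AnsiballZ_network_connections.py"], check=True)

    # 3. make the directory and payload executable, as in the chmod u+x step
    ssh(f"chmod u+x {remote_tmp} {remote_tmp}/AnsiballZ_network_connections.py")

    # 4. run it with the remote interpreter and parse the JSON object on stdout
    result = ssh(f"/usr/bin/python3.9 {remote_tmp}/AnsiballZ_network_connections.py")
    module_result = json.loads(result.stdout)
    print("changed:", module_result.get("changed"))

    # 5. remove the temp dir, matching the 'rm -f -r ... > /dev/null' command below
    ssh(f"rm -rf {remote_tmp}")

The JSON parsed in step 4 has the same shape as the payload logged above: module_args echoed back under "invocation" plus the module's own keys such as "changed", "warnings" and "stderr".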
28173 1726882777.75775: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882777.75778: _low_level_execute_command(): starting 28173 1726882777.75780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882777.3299146-29479-279532859499998/ > /dev/null 2>&1 && sleep 0' 28173 1726882777.76390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882777.76403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.76417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.76436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.76482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.76494: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882777.76507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.76523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882777.76533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882777.76543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882777.76553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882777.76568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882777.76584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882777.76596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882777.76606: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882777.76618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882777.76701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882777.76717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882777.76731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882777.76863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882777.78747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882777.78752: stdout chunk (state=3): >>><<< 28173 
1726882777.78756: stderr chunk (state=3): >>><<< 28173 1726882777.78777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882777.78783: handler run complete 28173 1726882777.78813: attempt loop complete, returning result 28173 1726882777.78816: _execute() done 28173 1726882777.78818: dumping result to json 28173 1726882777.78822: done dumping result, returning 28173 1726882777.78834: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-926c-8928-0000000000c7] 28173 1726882777.78839: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c7 28173 1726882777.78950: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c7 28173 1726882777.78952: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28173 1726882777.79045: no more pending results, returning what we have 28173 1726882777.79048: results queue empty 28173 1726882777.79049: checking for any_errors_fatal 28173 1726882777.79057: done checking for any_errors_fatal 28173 1726882777.79058: checking for max_fail_percentage 28173 1726882777.79059: done checking for max_fail_percentage 28173 1726882777.79061: checking to see if all hosts have failed and the running result is not ok 28173 1726882777.79061: done checking to see if all hosts have failed 28173 1726882777.79062: getting the remaining hosts for this loop 28173 1726882777.79065: done getting the remaining hosts for this loop 28173 1726882777.79069: getting the next task for host managed_node2 28173 1726882777.79074: done getting next task for host managed_node2 28173 1726882777.79078: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882777.79080: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882777.79089: getting variables 28173 1726882777.79090: in VariableManager get_vars() 28173 1726882777.79128: Calling all_inventory to load vars for managed_node2 28173 1726882777.79130: Calling groups_inventory to load vars for managed_node2 28173 1726882777.79132: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.79141: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.79143: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.79145: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.80779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.82534: done with get_vars() 28173 1726882777.82556: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:37 -0400 (0:00:00.696) 0:00:30.990 ****** 28173 1726882777.82636: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882777.82924: worker is 1 (out of 1 available) 28173 1726882777.82936: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882777.82946: done queuing things up, now waiting for results queue to drain 28173 1726882777.82948: waiting for pending results... 28173 1726882777.83230: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882777.83348: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c8 28173 1726882777.83371: variable 'ansible_search_path' from source: unknown 28173 1726882777.83379: variable 'ansible_search_path' from source: unknown 28173 1726882777.83424: calling self._execute() 28173 1726882777.83533: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.83547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.83562: variable 'omit' from source: magic vars 28173 1726882777.83950: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.83969: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.84103: variable 'network_state' from source: role '' defaults 28173 1726882777.84118: Evaluated conditional (network_state != {}): False 28173 1726882777.84124: when evaluation is False, skipping this task 28173 1726882777.84130: _execute() done 28173 1726882777.84137: dumping result to json 28173 1726882777.84145: done dumping result, returning 28173 1726882777.84157: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-926c-8928-0000000000c8] 28173 1726882777.84170: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c8 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882777.84329: no more pending results, returning what we have 28173 1726882777.84334: results queue empty 28173 1726882777.84335: checking for any_errors_fatal 28173 1726882777.84348: done checking for any_errors_fatal 28173 1726882777.84349: checking for max_fail_percentage 28173 1726882777.84351: done checking for max_fail_percentage 28173 1726882777.84352: checking to see if all hosts have failed and the running result is 
not ok 28173 1726882777.84353: done checking to see if all hosts have failed 28173 1726882777.84354: getting the remaining hosts for this loop 28173 1726882777.84355: done getting the remaining hosts for this loop 28173 1726882777.84359: getting the next task for host managed_node2 28173 1726882777.84367: done getting next task for host managed_node2 28173 1726882777.84372: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882777.84375: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882777.84392: getting variables 28173 1726882777.84394: in VariableManager get_vars() 28173 1726882777.84445: Calling all_inventory to load vars for managed_node2 28173 1726882777.84448: Calling groups_inventory to load vars for managed_node2 28173 1726882777.84450: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.84467: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.84471: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.84474: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.85701: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c8 28173 1726882777.85704: WORKER PROCESS EXITING 28173 1726882777.86178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.88102: done with get_vars() 28173 1726882777.88129: done getting variables 28173 1726882777.88199: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:37 -0400 (0:00:00.055) 0:00:31.046 ****** 28173 1726882777.88233: entering _queue_task() for managed_node2/debug 28173 1726882777.88545: worker is 1 (out of 1 available) 28173 1726882777.88556: exiting _queue_task() for managed_node2/debug 28173 1726882777.88569: done queuing things up, now waiting for results queue to drain 28173 1726882777.88571: waiting for pending results... 
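Several tasks in this stretch are decided purely by their when clause: the log reports "Evaluated conditional (ansible_distribution_major_version != '6'): True" and "Evaluated conditional (network_state != {}): False", and a False result ends _execute() with skip_reason "Conditional result was False". The snippet below is a rough stand-in, assuming only that jinja2 is installed, for how such a bare expression can be rendered against the task's variables and reduced to a boolean; it is not Ansible's own conditional evaluator.

    # Rough sketch of when-style evaluation: wrap the bare expression in an
    # {% if %} block, render it with the task's variables, and read the result.
    from jinja2 import Environment

    def evaluate_conditional(expr: str, variables: dict) -> bool:
        env = Environment()
        template = env.from_string("{% if " + expr + " %}True{% else %}False{% endif %}")
        return template.render(**variables) == "True"

    task_vars = {"ansible_distribution_major_version": "9", "network_state": {}}

    print(evaluate_conditional("ansible_distribution_major_version != '6'", task_vars))  # True
    print(evaluate_conditional("network_state != {}", task_vars))  # False -> task is skipped

With these example values the second call returns False, which corresponds to the skipping: [managed_node2] result recorded for the "Configure networking state" task above.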
28173 1726882777.88857: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882777.88987: in run() - task 0e448fcc-3ce9-926c-8928-0000000000c9 28173 1726882777.89013: variable 'ansible_search_path' from source: unknown 28173 1726882777.89021: variable 'ansible_search_path' from source: unknown 28173 1726882777.89070: calling self._execute() 28173 1726882777.89187: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.89202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.89216: variable 'omit' from source: magic vars 28173 1726882777.89623: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.89643: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.89657: variable 'omit' from source: magic vars 28173 1726882777.89709: variable 'omit' from source: magic vars 28173 1726882777.89752: variable 'omit' from source: magic vars 28173 1726882777.89802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882777.89843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882777.89872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882777.89900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.89919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.89955: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882777.89966: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.89975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.90084: Set connection var ansible_pipelining to False 28173 1726882777.90093: Set connection var ansible_shell_type to sh 28173 1726882777.90109: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882777.90122: Set connection var ansible_timeout to 10 28173 1726882777.90132: Set connection var ansible_shell_executable to /bin/sh 28173 1726882777.90141: Set connection var ansible_connection to ssh 28173 1726882777.90168: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.90176: variable 'ansible_connection' from source: unknown 28173 1726882777.90184: variable 'ansible_module_compression' from source: unknown 28173 1726882777.90191: variable 'ansible_shell_type' from source: unknown 28173 1726882777.90198: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.90209: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.90218: variable 'ansible_pipelining' from source: unknown 28173 1726882777.90225: variable 'ansible_timeout' from source: unknown 28173 1726882777.90232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.90382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882777.90400: variable 'omit' from source: magic vars 28173 1726882777.90410: starting attempt loop 28173 1726882777.90416: running the handler 28173 1726882777.90553: variable '__network_connections_result' from source: set_fact 28173 1726882777.90612: handler run complete 28173 1726882777.90638: attempt loop complete, returning result 28173 1726882777.90648: _execute() done 28173 1726882777.90654: dumping result to json 28173 1726882777.90661: done dumping result, returning 28173 1726882777.90679: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-926c-8928-0000000000c9] 28173 1726882777.90691: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c9 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 28173 1726882777.90849: no more pending results, returning what we have 28173 1726882777.90853: results queue empty 28173 1726882777.90854: checking for any_errors_fatal 28173 1726882777.90865: done checking for any_errors_fatal 28173 1726882777.90866: checking for max_fail_percentage 28173 1726882777.90868: done checking for max_fail_percentage 28173 1726882777.90869: checking to see if all hosts have failed and the running result is not ok 28173 1726882777.90869: done checking to see if all hosts have failed 28173 1726882777.90870: getting the remaining hosts for this loop 28173 1726882777.90872: done getting the remaining hosts for this loop 28173 1726882777.90876: getting the next task for host managed_node2 28173 1726882777.90882: done getting next task for host managed_node2 28173 1726882777.90887: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882777.90889: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882777.90899: getting variables 28173 1726882777.90901: in VariableManager get_vars() 28173 1726882777.90942: Calling all_inventory to load vars for managed_node2 28173 1726882777.90945: Calling groups_inventory to load vars for managed_node2 28173 1726882777.90947: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.90959: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.90963: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.90968: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.92170: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000c9 28173 1726882777.92173: WORKER PROCESS EXITING 28173 1726882777.92746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882777.94529: done with get_vars() 28173 1726882777.94552: done getting variables 28173 1726882777.94609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:37 -0400 (0:00:00.064) 0:00:31.110 ****** 28173 1726882777.94639: entering _queue_task() for managed_node2/debug 28173 1726882777.94898: worker is 1 (out of 1 available) 28173 1726882777.94910: exiting _queue_task() for managed_node2/debug 28173 1726882777.94923: done queuing things up, now waiting for results queue to drain 28173 1726882777.94924: waiting for pending results... 
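The debug task just above prints __network_connections_result.stderr_lines as [""] even though the module reported a stderr of "\n". The tiny sketch below is an assumption about the convention rather than Ansible's exact code: the *_lines variant of a result key falls out of splitting the raw text on newlines.

    # Why stderr "\n" shows up as stderr_lines [""]: splitlines() drops the
    # trailing newline, leaving a single empty line.
    def to_lines(text: str) -> list[str]:
        return text.splitlines()

    result = {"stderr": "\n"}
    result["stderr_lines"] = to_lines(result["stderr"])
    print(result["stderr_lines"])  # ['']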
28173 1726882777.95210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882777.95313: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ca 28173 1726882777.95337: variable 'ansible_search_path' from source: unknown 28173 1726882777.95344: variable 'ansible_search_path' from source: unknown 28173 1726882777.95395: calling self._execute() 28173 1726882777.95510: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.95522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.95537: variable 'omit' from source: magic vars 28173 1726882777.95939: variable 'ansible_distribution_major_version' from source: facts 28173 1726882777.95959: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882777.95976: variable 'omit' from source: magic vars 28173 1726882777.96025: variable 'omit' from source: magic vars 28173 1726882777.96068: variable 'omit' from source: magic vars 28173 1726882777.96114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882777.96161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882777.96189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882777.96212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.96234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882777.96269: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882777.96279: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.96287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.96398: Set connection var ansible_pipelining to False 28173 1726882777.96406: Set connection var ansible_shell_type to sh 28173 1726882777.96419: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882777.96432: Set connection var ansible_timeout to 10 28173 1726882777.96441: Set connection var ansible_shell_executable to /bin/sh 28173 1726882777.96455: Set connection var ansible_connection to ssh 28173 1726882777.96485: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.96494: variable 'ansible_connection' from source: unknown 28173 1726882777.96502: variable 'ansible_module_compression' from source: unknown 28173 1726882777.96509: variable 'ansible_shell_type' from source: unknown 28173 1726882777.96516: variable 'ansible_shell_executable' from source: unknown 28173 1726882777.96523: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882777.96530: variable 'ansible_pipelining' from source: unknown 28173 1726882777.96537: variable 'ansible_timeout' from source: unknown 28173 1726882777.96545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882777.96698: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882777.96715: variable 'omit' from source: magic vars 28173 1726882777.96724: starting attempt loop 28173 1726882777.96731: running the handler 28173 1726882777.96792: variable '__network_connections_result' from source: set_fact 28173 1726882777.96876: variable '__network_connections_result' from source: set_fact 28173 1726882777.96999: handler run complete 28173 1726882777.97028: attempt loop complete, returning result 28173 1726882777.97034: _execute() done 28173 1726882777.97040: dumping result to json 28173 1726882777.97048: done dumping result, returning 28173 1726882777.97059: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-926c-8928-0000000000ca] 28173 1726882777.97073: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ca 28173 1726882777.97185: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ca 28173 1726882777.97192: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28173 1726882777.97292: no more pending results, returning what we have 28173 1726882777.97296: results queue empty 28173 1726882777.97296: checking for any_errors_fatal 28173 1726882777.97303: done checking for any_errors_fatal 28173 1726882777.97304: checking for max_fail_percentage 28173 1726882777.97305: done checking for max_fail_percentage 28173 1726882777.97306: checking to see if all hosts have failed and the running result is not ok 28173 1726882777.97307: done checking to see if all hosts have failed 28173 1726882777.97308: getting the remaining hosts for this loop 28173 1726882777.97309: done getting the remaining hosts for this loop 28173 1726882777.97312: getting the next task for host managed_node2 28173 1726882777.97318: done getting next task for host managed_node2 28173 1726882777.97321: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882777.97323: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882777.97332: getting variables 28173 1726882777.97333: in VariableManager get_vars() 28173 1726882777.97369: Calling all_inventory to load vars for managed_node2 28173 1726882777.97371: Calling groups_inventory to load vars for managed_node2 28173 1726882777.97374: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882777.97385: Calling all_plugins_play to load vars for managed_node2 28173 1726882777.97388: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882777.97391: Calling groups_plugins_play to load vars for managed_node2 28173 1726882777.99214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.00978: done with get_vars() 28173 1726882778.01001: done getting variables 28173 1726882778.01059: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:38 -0400 (0:00:00.064) 0:00:31.175 ****** 28173 1726882778.01097: entering _queue_task() for managed_node2/debug 28173 1726882778.01377: worker is 1 (out of 1 available) 28173 1726882778.01388: exiting _queue_task() for managed_node2/debug 28173 1726882778.01400: done queuing things up, now waiting for results queue to drain 28173 1726882778.01401: waiting for pending results... 28173 1726882778.01686: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882778.01806: in run() - task 0e448fcc-3ce9-926c-8928-0000000000cb 28173 1726882778.01825: variable 'ansible_search_path' from source: unknown 28173 1726882778.01832: variable 'ansible_search_path' from source: unknown 28173 1726882778.01876: calling self._execute() 28173 1726882778.01982: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.01992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.02007: variable 'omit' from source: magic vars 28173 1726882778.02377: variable 'ansible_distribution_major_version' from source: facts 28173 1726882778.02399: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882778.02524: variable 'network_state' from source: role '' defaults 28173 1726882778.02539: Evaluated conditional (network_state != {}): False 28173 1726882778.02547: when evaluation is False, skipping this task 28173 1726882778.02554: _execute() done 28173 1726882778.02561: dumping result to json 28173 1726882778.02570: done dumping result, returning 28173 1726882778.02580: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-926c-8928-0000000000cb] 28173 1726882778.02590: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cb 28173 1726882778.02697: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cb 28173 1726882778.02706: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28173 1726882778.02753: no more pending results, returning what we 
have 28173 1726882778.02757: results queue empty 28173 1726882778.02758: checking for any_errors_fatal 28173 1726882778.02772: done checking for any_errors_fatal 28173 1726882778.02773: checking for max_fail_percentage 28173 1726882778.02775: done checking for max_fail_percentage 28173 1726882778.02776: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.02777: done checking to see if all hosts have failed 28173 1726882778.02778: getting the remaining hosts for this loop 28173 1726882778.02779: done getting the remaining hosts for this loop 28173 1726882778.02783: getting the next task for host managed_node2 28173 1726882778.02789: done getting next task for host managed_node2 28173 1726882778.02793: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882778.02796: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882778.02811: getting variables 28173 1726882778.02813: in VariableManager get_vars() 28173 1726882778.02851: Calling all_inventory to load vars for managed_node2 28173 1726882778.02853: Calling groups_inventory to load vars for managed_node2 28173 1726882778.02856: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.02870: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.02873: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.02876: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.04535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.06385: done with get_vars() 28173 1726882778.06408: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:38 -0400 (0:00:00.054) 0:00:31.229 ****** 28173 1726882778.06502: entering _queue_task() for managed_node2/ping 28173 1726882778.06774: worker is 1 (out of 1 available) 28173 1726882778.06786: exiting _queue_task() for managed_node2/ping 28173 1726882778.06799: done queuing things up, now waiting for results queue to drain 28173 1726882778.06800: waiting for pending results... 
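
For reference, the role task that was just skipped corresponds roughly to the following YAML. This is a minimal sketch reconstructed only from the task name, the task path (roles/network/tasks/main.yml:186) and the false_condition reported above; the debug argument (var: network_state) is an assumption, and the exact body in fedora.linux_system_roles.network may differ.

- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state        # assumption: the log does not show the debug argument
  when: network_state != {}   # evaluated False above, so the task was skipped
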
28173 1726882778.07087: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882778.07210: in run() - task 0e448fcc-3ce9-926c-8928-0000000000cc 28173 1726882778.07231: variable 'ansible_search_path' from source: unknown 28173 1726882778.07240: variable 'ansible_search_path' from source: unknown 28173 1726882778.07287: calling self._execute() 28173 1726882778.07396: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.07406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.07421: variable 'omit' from source: magic vars 28173 1726882778.07802: variable 'ansible_distribution_major_version' from source: facts 28173 1726882778.07819: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882778.07830: variable 'omit' from source: magic vars 28173 1726882778.07874: variable 'omit' from source: magic vars 28173 1726882778.07918: variable 'omit' from source: magic vars 28173 1726882778.07961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882778.08003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882778.08028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882778.08050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882778.08070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882778.08102: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882778.08110: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.08122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.08221: Set connection var ansible_pipelining to False 28173 1726882778.08230: Set connection var ansible_shell_type to sh 28173 1726882778.08239: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882778.08248: Set connection var ansible_timeout to 10 28173 1726882778.08254: Set connection var ansible_shell_executable to /bin/sh 28173 1726882778.08260: Set connection var ansible_connection to ssh 28173 1726882778.08286: variable 'ansible_shell_executable' from source: unknown 28173 1726882778.08292: variable 'ansible_connection' from source: unknown 28173 1726882778.08298: variable 'ansible_module_compression' from source: unknown 28173 1726882778.08302: variable 'ansible_shell_type' from source: unknown 28173 1726882778.08307: variable 'ansible_shell_executable' from source: unknown 28173 1726882778.08311: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.08316: variable 'ansible_pipelining' from source: unknown 28173 1726882778.08321: variable 'ansible_timeout' from source: unknown 28173 1726882778.08326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.08518: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882778.08535: variable 'omit' from source: magic vars 28173 
1726882778.08543: starting attempt loop 28173 1726882778.08553: running the handler 28173 1726882778.08571: _low_level_execute_command(): starting 28173 1726882778.08582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882778.09339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.09354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.09370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.09390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.09432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.09445: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.09459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.09481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.09493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.09503: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.09513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.09525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.09539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.09550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.09565: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.09581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.09655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.09682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.09698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.09833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.11495: stdout chunk (state=3): >>>/root <<< 28173 1726882778.11684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.11688: stdout chunk (state=3): >>><<< 28173 1726882778.11690: stderr chunk (state=3): >>><<< 28173 1726882778.11795: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.11799: _low_level_execute_command(): starting 28173 1726882778.11802: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262 `" && echo ansible-tmp-1726882778.1170962-29536-14147913780262="` echo /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262 `" ) && sleep 0' 28173 1726882778.12401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.12415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.12433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.12453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.12496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.12509: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.12523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.12545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.12558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.12574: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.12587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.12601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.12617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.12631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.12649: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.12663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.12740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.12768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.12785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.12914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.14775: stdout chunk (state=3): >>>ansible-tmp-1726882778.1170962-29536-14147913780262=/root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262 <<< 28173 1726882778.14890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.14937: stderr chunk (state=3): >>><<< 28173 1726882778.14971: stdout chunk (state=3): >>><<< 28173 1726882778.15259: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882778.1170962-29536-14147913780262=/root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.15263: variable 'ansible_module_compression' from source: unknown 28173 1726882778.15274: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28173 1726882778.15276: variable 'ansible_facts' from source: unknown 28173 1726882778.15278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/AnsiballZ_ping.py 28173 1726882778.15355: Sending initial data 28173 1726882778.15358: Sent initial data (152 bytes) 28173 1726882778.16320: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.16324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.16357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.16360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.16370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.16417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.16421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.16524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.18256: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882778.18344: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882778.18443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpnyn0vh73 /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/AnsiballZ_ping.py <<< 28173 1726882778.18539: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882778.19810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.19871: stderr chunk (state=3): >>><<< 28173 1726882778.19874: stdout chunk (state=3): >>><<< 28173 1726882778.19883: done transferring module to remote 28173 1726882778.19892: _low_level_execute_command(): starting 28173 1726882778.19896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/ /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/AnsiballZ_ping.py && sleep 0' 28173 1726882778.20298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.20304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.20331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.20338: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.20346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.20355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.20362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.20370: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.20381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.20386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882778.20397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.20440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.20461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.20470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.20567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.22312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 28173 1726882778.22358: stderr chunk (state=3): >>><<< 28173 1726882778.22361: stdout chunk (state=3): >>><<< 28173 1726882778.22380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.22383: _low_level_execute_command(): starting 28173 1726882778.22386: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/AnsiballZ_ping.py && sleep 0' 28173 1726882778.22971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.22981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.22991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.23005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.23041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.23049: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.23059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.23074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.23083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.23093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.23100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.23109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.23121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.23128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.23135: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.23144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.23220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.23236: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28173 1726882778.23247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.23378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.36414: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28173 1726882778.37617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882778.37621: stdout chunk (state=3): >>><<< 28173 1726882778.37626: stderr chunk (state=3): >>><<< 28173 1726882778.37650: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
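
The exchange above is the usual module round trip for the "Re-test connectivity" task: Ansible creates a remote temp directory, copies AnsiballZ_ping.py over sftp, marks it executable, runs it with /usr/bin/python3.9, and the module prints {"ping": "pong"}. In task form this is the ping module with no arguments; a minimal sketch based on the task name and module shown in the log (the role may spell it slightly differently), with a hypothetical register added only to illustrate the returned structure:

- name: Re-test connectivity
  ansible.builtin.ping:
  register: __network_connectivity   # hypothetical variable name, not part of the role

# On success __network_connectivity.ping equals "pong", matching the
# {"ping": "pong"} payload in the stdout chunk above and the
# ok: [managed_node2] => {"changed": false, "ping": "pong"} result that follows.
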
28173 1726882778.37677: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882778.37686: _low_level_execute_command(): starting 28173 1726882778.37691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882778.1170962-29536-14147913780262/ > /dev/null 2>&1 && sleep 0' 28173 1726882778.38337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.38354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.38370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.38381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.38432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.38439: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.38448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.38462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.38471: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.38478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.38486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.38495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.38507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.38516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.38524: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.38537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.38619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.38637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.38651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.38777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.40658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.40662: stdout chunk (state=3): >>><<< 28173 1726882778.40677: stderr chunk (state=3): >>><<< 28173 1726882778.40691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.40698: handler run complete 28173 1726882778.40713: attempt loop complete, returning result 28173 1726882778.40716: _execute() done 28173 1726882778.40718: dumping result to json 28173 1726882778.40720: done dumping result, returning 28173 1726882778.40730: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-926c-8928-0000000000cc] 28173 1726882778.40736: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cc 28173 1726882778.40831: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cc 28173 1726882778.40834: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28173 1726882778.41021: no more pending results, returning what we have 28173 1726882778.41024: results queue empty 28173 1726882778.41025: checking for any_errors_fatal 28173 1726882778.41030: done checking for any_errors_fatal 28173 1726882778.41031: checking for max_fail_percentage 28173 1726882778.41032: done checking for max_fail_percentage 28173 1726882778.41033: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.41034: done checking to see if all hosts have failed 28173 1726882778.41035: getting the remaining hosts for this loop 28173 1726882778.41036: done getting the remaining hosts for this loop 28173 1726882778.41040: getting the next task for host managed_node2 28173 1726882778.41047: done getting next task for host managed_node2 28173 1726882778.41049: ^ task is: TASK: meta (role_complete) 28173 1726882778.41051: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882778.41059: getting variables 28173 1726882778.41061: in VariableManager get_vars() 28173 1726882778.41103: Calling all_inventory to load vars for managed_node2 28173 1726882778.41106: Calling groups_inventory to load vars for managed_node2 28173 1726882778.41108: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.41119: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.41122: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.41125: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.42818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.45293: done with get_vars() 28173 1726882778.45320: done getting variables 28173 1726882778.45405: done queuing things up, now waiting for results queue to drain 28173 1726882778.45407: results queue empty 28173 1726882778.45408: checking for any_errors_fatal 28173 1726882778.45411: done checking for any_errors_fatal 28173 1726882778.45412: checking for max_fail_percentage 28173 1726882778.45413: done checking for max_fail_percentage 28173 1726882778.45414: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.45415: done checking to see if all hosts have failed 28173 1726882778.45415: getting the remaining hosts for this loop 28173 1726882778.45416: done getting the remaining hosts for this loop 28173 1726882778.45419: getting the next task for host managed_node2 28173 1726882778.45422: done getting next task for host managed_node2 28173 1726882778.45424: ^ task is: TASK: meta (flush_handlers) 28173 1726882778.45425: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882778.45428: getting variables 28173 1726882778.45429: in VariableManager get_vars() 28173 1726882778.45441: Calling all_inventory to load vars for managed_node2 28173 1726882778.45443: Calling groups_inventory to load vars for managed_node2 28173 1726882778.45445: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.45450: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.45453: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.45456: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.48237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.51404: done with get_vars() 28173 1726882778.51428: done getting variables 28173 1726882778.51482: in VariableManager get_vars() 28173 1726882778.51496: Calling all_inventory to load vars for managed_node2 28173 1726882778.51498: Calling groups_inventory to load vars for managed_node2 28173 1726882778.51500: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.51505: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.51507: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.51510: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.53257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.55077: done with get_vars() 28173 1726882778.55107: done queuing things up, now waiting for results queue to drain 28173 1726882778.55109: results queue empty 28173 1726882778.55110: checking for any_errors_fatal 28173 1726882778.55112: done checking for any_errors_fatal 28173 1726882778.55112: checking for max_fail_percentage 28173 1726882778.55114: done checking for max_fail_percentage 28173 1726882778.55114: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.55115: done checking to see if all hosts have failed 28173 1726882778.55116: getting the remaining hosts for this loop 28173 1726882778.55117: done getting the remaining hosts for this loop 28173 1726882778.55120: getting the next task for host managed_node2 28173 1726882778.55124: done getting next task for host managed_node2 28173 1726882778.55125: ^ task is: TASK: meta (flush_handlers) 28173 1726882778.55127: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882778.55129: getting variables 28173 1726882778.55131: in VariableManager get_vars() 28173 1726882778.55142: Calling all_inventory to load vars for managed_node2 28173 1726882778.55144: Calling groups_inventory to load vars for managed_node2 28173 1726882778.55146: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.55153: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.55156: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.55164: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.57619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.61938: done with get_vars() 28173 1726882778.61975: done getting variables 28173 1726882778.62027: in VariableManager get_vars() 28173 1726882778.62041: Calling all_inventory to load vars for managed_node2 28173 1726882778.62043: Calling groups_inventory to load vars for managed_node2 28173 1726882778.62045: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.62051: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.62053: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.62056: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.63458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.65248: done with get_vars() 28173 1726882778.65279: done queuing things up, now waiting for results queue to drain 28173 1726882778.65282: results queue empty 28173 1726882778.65282: checking for any_errors_fatal 28173 1726882778.65284: done checking for any_errors_fatal 28173 1726882778.65284: checking for max_fail_percentage 28173 1726882778.65285: done checking for max_fail_percentage 28173 1726882778.65286: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.65287: done checking to see if all hosts have failed 28173 1726882778.65288: getting the remaining hosts for this loop 28173 1726882778.65288: done getting the remaining hosts for this loop 28173 1726882778.65291: getting the next task for host managed_node2 28173 1726882778.65294: done getting next task for host managed_node2 28173 1726882778.65295: ^ task is: None 28173 1726882778.65297: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882778.65298: done queuing things up, now waiting for results queue to drain 28173 1726882778.65299: results queue empty 28173 1726882778.65300: checking for any_errors_fatal 28173 1726882778.65301: done checking for any_errors_fatal 28173 1726882778.65301: checking for max_fail_percentage 28173 1726882778.65302: done checking for max_fail_percentage 28173 1726882778.65303: checking to see if all hosts have failed and the running result is not ok 28173 1726882778.65303: done checking to see if all hosts have failed 28173 1726882778.65305: getting the next task for host managed_node2 28173 1726882778.65307: done getting next task for host managed_node2 28173 1726882778.65308: ^ task is: None 28173 1726882778.65309: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882778.65357: in VariableManager get_vars() 28173 1726882778.65381: done with get_vars() 28173 1726882778.65388: in VariableManager get_vars() 28173 1726882778.65398: done with get_vars() 28173 1726882778.65402: variable 'omit' from source: magic vars 28173 1726882778.65435: in VariableManager get_vars() 28173 1726882778.65446: done with get_vars() 28173 1726882778.65470: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 28173 1726882778.65730: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28173 1726882778.65753: getting the remaining hosts for this loop 28173 1726882778.65754: done getting the remaining hosts for this loop 28173 1726882778.65756: getting the next task for host managed_node2 28173 1726882778.65759: done getting next task for host managed_node2 28173 1726882778.65761: ^ task is: TASK: Gathering Facts 28173 1726882778.65762: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882778.65766: getting variables 28173 1726882778.65767: in VariableManager get_vars() 28173 1726882778.65776: Calling all_inventory to load vars for managed_node2 28173 1726882778.65778: Calling groups_inventory to load vars for managed_node2 28173 1726882778.65780: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882778.65785: Calling all_plugins_play to load vars for managed_node2 28173 1726882778.65788: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882778.65791: Calling groups_plugins_play to load vars for managed_node2 28173 1726882778.67092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882778.68888: done with get_vars() 28173 1726882778.68907: done getting variables 28173 1726882778.68948: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:39:38 -0400 (0:00:00.624) 0:00:31.854 ****** 28173 1726882778.68979: entering _queue_task() for managed_node2/gather_facts 28173 1726882778.69306: worker is 1 (out of 1 available) 28173 1726882778.69322: exiting _queue_task() for managed_node2/gather_facts 28173 1726882778.69333: done queuing things up, now waiting for results queue to drain 28173 1726882778.69334: waiting for pending results... 
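
The play that starts here ("Delete the interface", from down_profile+delete_interface.yml:5) opens with implicit fact gathering, which is why a "Gathering Facts" task is queued before any user task. A minimal sketch of such a play header, assuming default fact gathering; the host pattern is an assumption, since only the play name and playbook path are visible in the log, and the play's own tasks are outside this excerpt:

- name: Delete the interface
  hosts: all            # assumption: the real host pattern is not shown in the log
  gather_facts: true    # produces the "Gathering Facts" task traced below
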
28173 1726882778.69615: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882778.69722: in run() - task 0e448fcc-3ce9-926c-8928-00000000076f 28173 1726882778.69742: variable 'ansible_search_path' from source: unknown 28173 1726882778.69789: calling self._execute() 28173 1726882778.69896: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.69907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.69920: variable 'omit' from source: magic vars 28173 1726882778.70324: variable 'ansible_distribution_major_version' from source: facts 28173 1726882778.70342: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882778.70353: variable 'omit' from source: magic vars 28173 1726882778.70416: variable 'omit' from source: magic vars 28173 1726882778.70457: variable 'omit' from source: magic vars 28173 1726882778.70501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882778.70546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882778.70571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882778.70593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882778.70611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882778.70653: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882778.70661: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.70672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.70778: Set connection var ansible_pipelining to False 28173 1726882778.70786: Set connection var ansible_shell_type to sh 28173 1726882778.70799: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882778.70812: Set connection var ansible_timeout to 10 28173 1726882778.70821: Set connection var ansible_shell_executable to /bin/sh 28173 1726882778.70831: Set connection var ansible_connection to ssh 28173 1726882778.70862: variable 'ansible_shell_executable' from source: unknown 28173 1726882778.70874: variable 'ansible_connection' from source: unknown 28173 1726882778.70881: variable 'ansible_module_compression' from source: unknown 28173 1726882778.70889: variable 'ansible_shell_type' from source: unknown 28173 1726882778.70895: variable 'ansible_shell_executable' from source: unknown 28173 1726882778.70901: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882778.70908: variable 'ansible_pipelining' from source: unknown 28173 1726882778.70914: variable 'ansible_timeout' from source: unknown 28173 1726882778.70922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882778.71112: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882778.71129: variable 'omit' from source: magic vars 28173 1726882778.71139: starting attempt loop 28173 1726882778.71146: running the 
handler 28173 1726882778.71170: variable 'ansible_facts' from source: unknown 28173 1726882778.71201: _low_level_execute_command(): starting 28173 1726882778.71215: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882778.71942: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.71957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.71977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.71999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.72041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.72054: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.72074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.72101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.72115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.72127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.72139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.72153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.72172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.72185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.72205: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.72222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.72296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.72326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.72342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.72480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.74142: stdout chunk (state=3): >>>/root <<< 28173 1726882778.74247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.74326: stderr chunk (state=3): >>><<< 28173 1726882778.74338: stdout chunk (state=3): >>><<< 28173 1726882778.74447: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.74472: _low_level_execute_command(): starting 28173 1726882778.74475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097 `" && echo ansible-tmp-1726882778.7436564-29575-8059400849097="` echo /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097 `" ) && sleep 0' 28173 1726882778.75077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.75091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.75106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.75124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.75175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.75187: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.75201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.75219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.75231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.75243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.75260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.75277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.75292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.75303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.75314: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.75327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.75408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.75429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.75446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.75588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.77459: stdout chunk (state=3): >>>ansible-tmp-1726882778.7436564-29575-8059400849097=/root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097 <<< 28173 1726882778.77652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.77656: stdout chunk (state=3): >>><<< 28173 1726882778.77658: stderr chunk (state=3): >>><<< 28173 1726882778.77943: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882778.7436564-29575-8059400849097=/root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.77946: variable 'ansible_module_compression' from source: unknown 28173 1726882778.77948: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882778.77950: variable 'ansible_facts' from source: unknown 28173 1726882778.77990: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/AnsiballZ_setup.py 28173 1726882778.78155: Sending initial data 28173 1726882778.78158: Sent initial data (152 bytes) 28173 1726882778.80078: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.80091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.80105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.80121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.80187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.80199: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.80212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.80229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.80400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.80412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.80425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.80444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.80460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.80476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.80487: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.80501: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.80581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.80602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.80617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.80747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.82501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882778.82600: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882778.82700: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp6at57h18 /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/AnsiballZ_setup.py <<< 28173 1726882778.82796: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882778.86181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.86367: stderr chunk (state=3): >>><<< 28173 1726882778.86371: stdout chunk (state=3): >>><<< 28173 1726882778.86373: done transferring module to remote 28173 1726882778.86379: _low_level_execute_command(): starting 28173 1726882778.86382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/ /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/AnsiballZ_setup.py && sleep 0' 28173 1726882778.86982: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.86995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.87008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.87035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.87077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.87089: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.87102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.87118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.87134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882778.87144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.87155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.87170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.87185: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.87196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.87205: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.87217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.87299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.87318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.87332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.87459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882778.89235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882778.89307: stderr chunk (state=3): >>><<< 28173 1726882778.89317: stdout chunk (state=3): >>><<< 28173 1726882778.89409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882778.89412: _low_level_execute_command(): starting 28173 1726882778.89415: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/AnsiballZ_setup.py && sleep 0' 28173 1726882778.90170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882778.90186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.90200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.90216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.90256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.90277: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882778.90297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.90316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882778.90327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 
1726882778.90337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882778.90347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882778.90359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882778.90380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882778.90395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882778.90405: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882778.90417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882778.90495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882778.90514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882778.90527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882778.90660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882779.46647: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_loadavg": {"1m": 0.38, "5m": 0.4, "15m": 0.25}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3279, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chas<<< 28173 1726882779.46688: stdout chunk (state=3): >>>sis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", 
"ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 718, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238379008, "block_size": 4096, "block_total": 65519355, "block_available": 64511323, "block_used": 1008032, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYp<<< 28173 1726882779.46712: stdout chunk (state=3): >>>ZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "39", "epoch": "1726882779", "epoch_int": "1726882779", "date": "2024-09-20", "time": "21:39:39", "iso8601_micro": "2024-09-21T01:39:39.391479Z", "iso8601": "2024-09-21T01:39:39Z", "iso8601_basic": "20240920T213939391479", "iso8601_basic_short": "20240920T213939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["rpltstbr", "peerethtest0", "ethtest0", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "92:35:3e:53:1a:d5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::9035:3eff:fe53:1ad5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ce:7d:c7:1b:e6:34", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::38f0:8f6a:fdbd:1536", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::9035:3eff:fe53:1ad5", "fe80::38f0:8f6a:fdbd:1536", "fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1", "fe80::38f0:8f6a:fdbd:1536", "fe80::9035:3eff:fe53:1ad5"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882779.48388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882779.48467: stderr chunk (state=3): >>><<< 28173 1726882779.48470: stdout chunk (state=3): >>><<< 28173 1726882779.48711: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_loadavg": {"1m": 0.38, "5m": 0.4, "15m": 0.25}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3279, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 718, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238379008, "block_size": 4096, "block_total": 65519355, "block_available": 64511323, "block_used": 1008032, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "39", "epoch": "1726882779", "epoch_int": "1726882779", "date": "2024-09-20", "time": "21:39:39", "iso8601_micro": "2024-09-21T01:39:39.391479Z", "iso8601": "2024-09-21T01:39:39Z", "iso8601_basic": "20240920T213939391479", "iso8601_basic_short": "20240920T213939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["rpltstbr", "peerethtest0", "ethtest0", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": 
"on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "92:35:3e:53:1a:d5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::9035:3eff:fe53:1ad5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ce:7d:c7:1b:e6:34", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::38f0:8f6a:fdbd:1536", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::9035:3eff:fe53:1ad5", "fe80::38f0:8f6a:fdbd:1536", "fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1", "fe80::38f0:8f6a:fdbd:1536", "fe80::9035:3eff:fe53:1ad5"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882779.49381: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882779.49410: _low_level_execute_command(): starting 28173 1726882779.49420: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882778.7436564-29575-8059400849097/ > /dev/null 2>&1 && sleep 0' 28173 1726882779.52635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882779.52652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882779.52670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882779.52692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882779.52744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882779.52844: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882779.52860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882779.52882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882779.52895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882779.52907: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882779.52919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882779.52933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882779.52956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882779.52971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882779.52983: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882779.52998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882779.53297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882779.53321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882779.53339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882779.53475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882779.55422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882779.55425: stdout chunk (state=3): >>><<< 28173 1726882779.55428: stderr chunk (state=3): >>><<< 28173 1726882779.55674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882779.55677: handler run complete 28173 1726882779.55680: variable 'ansible_facts' from source: unknown 28173 1726882779.55749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.56416: variable 'ansible_facts' from source: unknown 28173 1726882779.56982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.57188: attempt loop complete, returning result 28173 1726882779.57423: _execute() done 28173 1726882779.57433: dumping result to json 28173 1726882779.57485: done dumping result, returning 28173 1726882779.57657: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-00000000076f] 28173 1726882779.57751: sending task result for task 0e448fcc-3ce9-926c-8928-00000000076f ok: [managed_node2] 28173 1726882779.59285: no more pending results, returning what we have 28173 1726882779.59289: results queue empty 28173 1726882779.59290: checking for any_errors_fatal 28173 1726882779.59291: done checking for any_errors_fatal 28173 1726882779.59292: checking for max_fail_percentage 28173 1726882779.59293: done checking for max_fail_percentage 28173 1726882779.59294: checking to see if all hosts have failed and the running result is not ok 28173 1726882779.59295: done checking to see if all hosts have failed 28173 1726882779.59295: getting the remaining hosts for this loop 28173 1726882779.59297: done getting the remaining hosts for this loop 28173 1726882779.59300: getting the next task for host managed_node2 28173 1726882779.59306: done getting next task for host managed_node2 28173 1726882779.59308: ^ task is: TASK: meta (flush_handlers) 28173 1726882779.59310: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882779.59314: getting variables 28173 1726882779.59315: in VariableManager get_vars() 28173 1726882779.59339: Calling all_inventory to load vars for managed_node2 28173 1726882779.59341: Calling groups_inventory to load vars for managed_node2 28173 1726882779.59344: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882779.59357: Calling all_plugins_play to load vars for managed_node2 28173 1726882779.59360: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882779.59362: Calling groups_plugins_play to load vars for managed_node2 28173 1726882779.60366: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000076f 28173 1726882779.60371: WORKER PROCESS EXITING 28173 1726882779.63562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.66711: done with get_vars() 28173 1726882779.66738: done getting variables 28173 1726882779.66813: in VariableManager get_vars() 28173 1726882779.66824: Calling all_inventory to load vars for managed_node2 28173 1726882779.66827: Calling groups_inventory to load vars for managed_node2 28173 1726882779.66829: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882779.66834: Calling all_plugins_play to load vars for managed_node2 28173 1726882779.66836: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882779.66844: Calling groups_plugins_play to load vars for managed_node2 28173 1726882779.70189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.73218: done with get_vars() 28173 1726882779.73246: done queuing things up, now waiting for results queue to drain 28173 1726882779.73248: results queue empty 28173 1726882779.73249: checking for any_errors_fatal 28173 1726882779.73253: done checking for any_errors_fatal 28173 1726882779.73254: checking for max_fail_percentage 28173 1726882779.73255: done checking for max_fail_percentage 28173 1726882779.73256: checking to see if all hosts have failed and the running result is not ok 28173 1726882779.73257: done checking to see if all hosts have failed 28173 1726882779.73257: getting the remaining hosts for this loop 28173 1726882779.73258: done getting the remaining hosts for this loop 28173 1726882779.73261: getting the next task for host managed_node2 28173 1726882779.73269: done getting next task for host managed_node2 28173 1726882779.73272: ^ task is: TASK: Include the task 'delete_interface.yml' 28173 1726882779.73274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882779.73276: getting variables 28173 1726882779.73277: in VariableManager get_vars() 28173 1726882779.73287: Calling all_inventory to load vars for managed_node2 28173 1726882779.73289: Calling groups_inventory to load vars for managed_node2 28173 1726882779.73291: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882779.73296: Calling all_plugins_play to load vars for managed_node2 28173 1726882779.73299: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882779.73301: Calling groups_plugins_play to load vars for managed_node2 28173 1726882779.76184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.88245: done with get_vars() 28173 1726882779.88280: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:39:39 -0400 (0:00:01.193) 0:00:33.047 ****** 28173 1726882779.88354: entering _queue_task() for managed_node2/include_tasks 28173 1726882779.88992: worker is 1 (out of 1 available) 28173 1726882779.89006: exiting _queue_task() for managed_node2/include_tasks 28173 1726882779.89018: done queuing things up, now waiting for results queue to drain 28173 1726882779.89020: waiting for pending results... 28173 1726882779.90015: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 28173 1726882779.90326: in run() - task 0e448fcc-3ce9-926c-8928-0000000000cf 28173 1726882779.90392: variable 'ansible_search_path' from source: unknown 28173 1726882779.90446: calling self._execute() 28173 1726882779.90622: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882779.90635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882779.90650: variable 'omit' from source: magic vars 28173 1726882779.91079: variable 'ansible_distribution_major_version' from source: facts 28173 1726882779.91098: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882779.91109: _execute() done 28173 1726882779.91117: dumping result to json 28173 1726882779.91124: done dumping result, returning 28173 1726882779.91138: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0e448fcc-3ce9-926c-8928-0000000000cf] 28173 1726882779.91150: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cf 28173 1726882779.91296: no more pending results, returning what we have 28173 1726882779.91302: in VariableManager get_vars() 28173 1726882779.91340: Calling all_inventory to load vars for managed_node2 28173 1726882779.91343: Calling groups_inventory to load vars for managed_node2 28173 1726882779.91347: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882779.91362: Calling all_plugins_play to load vars for managed_node2 28173 1726882779.91367: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882779.91371: Calling groups_plugins_play to load vars for managed_node2 28173 1726882779.92483: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000cf 28173 1726882779.92487: WORKER PROCESS EXITING 28173 1726882779.93924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882779.96110: done with get_vars() 28173 
1726882779.96132: variable 'ansible_search_path' from source: unknown 28173 1726882779.96148: we have included files to process 28173 1726882779.96149: generating all_blocks data 28173 1726882779.96150: done generating all_blocks data 28173 1726882779.96151: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28173 1726882779.96152: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28173 1726882779.96154: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28173 1726882779.96397: done processing included file 28173 1726882779.96399: iterating over new_blocks loaded from include file 28173 1726882779.96400: in VariableManager get_vars() 28173 1726882779.96414: done with get_vars() 28173 1726882779.96416: filtering new block on tags 28173 1726882779.96454: done filtering new block on tags 28173 1726882779.96456: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 28173 1726882779.96462: extending task lists for all hosts with included blocks 28173 1726882779.96501: done extending task lists 28173 1726882779.96503: done processing included files 28173 1726882779.96503: results queue empty 28173 1726882779.96504: checking for any_errors_fatal 28173 1726882779.96514: done checking for any_errors_fatal 28173 1726882779.96914: checking for max_fail_percentage 28173 1726882779.96916: done checking for max_fail_percentage 28173 1726882779.96917: checking to see if all hosts have failed and the running result is not ok 28173 1726882779.96918: done checking to see if all hosts have failed 28173 1726882779.96918: getting the remaining hosts for this loop 28173 1726882779.96920: done getting the remaining hosts for this loop 28173 1726882779.96923: getting the next task for host managed_node2 28173 1726882779.96927: done getting next task for host managed_node2 28173 1726882779.96929: ^ task is: TASK: Remove test interface if necessary 28173 1726882779.96931: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882779.96934: getting variables 28173 1726882779.96934: in VariableManager get_vars() 28173 1726882779.96943: Calling all_inventory to load vars for managed_node2 28173 1726882779.96945: Calling groups_inventory to load vars for managed_node2 28173 1726882779.96947: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882779.96952: Calling all_plugins_play to load vars for managed_node2 28173 1726882779.96954: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882779.96957: Calling groups_plugins_play to load vars for managed_node2 28173 1726882779.98767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.02081: done with get_vars() 28173 1726882780.02139: done getting variables 28173 1726882780.02233: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:39:40 -0400 (0:00:00.139) 0:00:33.187 ****** 28173 1726882780.02267: entering _queue_task() for managed_node2/command 28173 1726882780.03517: worker is 1 (out of 1 available) 28173 1726882780.03531: exiting _queue_task() for managed_node2/command 28173 1726882780.03543: done queuing things up, now waiting for results queue to drain 28173 1726882780.03544: waiting for pending results... 28173 1726882780.04178: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 28173 1726882780.04320: in run() - task 0e448fcc-3ce9-926c-8928-000000000780 28173 1726882780.04360: variable 'ansible_search_path' from source: unknown 28173 1726882780.04372: variable 'ansible_search_path' from source: unknown 28173 1726882780.04416: calling self._execute() 28173 1726882780.04532: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.04550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.04572: variable 'omit' from source: magic vars 28173 1726882780.06584: variable 'ansible_distribution_major_version' from source: facts 28173 1726882780.06608: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882780.06619: variable 'omit' from source: magic vars 28173 1726882780.06659: variable 'omit' from source: magic vars 28173 1726882780.06954: variable 'interface' from source: set_fact 28173 1726882780.06979: variable 'omit' from source: magic vars 28173 1726882780.07029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882780.07185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882780.07210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882780.07231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882780.07253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 
1726882780.07305: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882780.07313: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.07320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.07440: Set connection var ansible_pipelining to False 28173 1726882780.07449: Set connection var ansible_shell_type to sh 28173 1726882780.07461: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882780.07574: Set connection var ansible_timeout to 10 28173 1726882780.07589: Set connection var ansible_shell_executable to /bin/sh 28173 1726882780.07599: Set connection var ansible_connection to ssh 28173 1726882780.07623: variable 'ansible_shell_executable' from source: unknown 28173 1726882780.07630: variable 'ansible_connection' from source: unknown 28173 1726882780.07636: variable 'ansible_module_compression' from source: unknown 28173 1726882780.07641: variable 'ansible_shell_type' from source: unknown 28173 1726882780.07648: variable 'ansible_shell_executable' from source: unknown 28173 1726882780.07654: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.07662: variable 'ansible_pipelining' from source: unknown 28173 1726882780.07671: variable 'ansible_timeout' from source: unknown 28173 1726882780.07680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.07823: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882780.07839: variable 'omit' from source: magic vars 28173 1726882780.07847: starting attempt loop 28173 1726882780.07853: running the handler 28173 1726882780.07873: _low_level_execute_command(): starting 28173 1726882780.07887: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882780.09740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.09745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.09772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.09788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.09791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.09845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.09864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.10006: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.11687: stdout chunk (state=3): >>>/root <<< 28173 1726882780.11857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.11873: stdout chunk (state=3): >>><<< 28173 1726882780.11878: stderr chunk (state=3): >>><<< 28173 1726882780.11991: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.11995: _low_level_execute_command(): starting 28173 1726882780.11997: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671 `" && echo ansible-tmp-1726882780.118999-29623-104442320772671="` echo /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671 `" ) && sleep 0' 28173 1726882780.12569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.12580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.12590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.12602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.12637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.12648: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.12661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.12681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.12689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.12695: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.12704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.12712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.12723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.12729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.12736: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.12744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.12825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.12840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.12851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.12995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.14901: stdout chunk (state=3): >>>ansible-tmp-1726882780.118999-29623-104442320772671=/root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671 <<< 28173 1726882780.15087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.15091: stdout chunk (state=3): >>><<< 28173 1726882780.15104: stderr chunk (state=3): >>><<< 28173 1726882780.15125: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882780.118999-29623-104442320772671=/root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.15153: variable 'ansible_module_compression' from source: unknown 28173 1726882780.15222: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882780.15276: variable 'ansible_facts' from source: unknown 28173 1726882780.15360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/AnsiballZ_command.py 28173 1726882780.15526: Sending initial data 28173 1726882780.15536: Sent initial data (155 bytes) 28173 1726882780.16653: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.16672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.16687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.16714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.16774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.16794: stderr chunk 
(state=3): >>>debug2: match not found <<< 28173 1726882780.16819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.16841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.16857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.16876: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.16888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.16910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.16932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.16951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.16974: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.16990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.17085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.17104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.17122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.17349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.19092: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882780.19196: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882780.19299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpfd0ofh5f /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/AnsiballZ_command.py <<< 28173 1726882780.19397: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882780.20734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.21039: stderr chunk (state=3): >>><<< 28173 1726882780.21042: stdout chunk (state=3): >>><<< 28173 1726882780.21045: done transferring module to remote 28173 1726882780.21047: _low_level_execute_command(): starting 28173 1726882780.21050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/ /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/AnsiballZ_command.py && sleep 0' 28173 1726882780.22063: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.22096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.22111: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.22133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.22250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.22273: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.22296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.22323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.22341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.22362: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.22367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.22377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.22389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.22397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.22403: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.22413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.22486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.22499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.22547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.22771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.24802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.24899: stderr chunk (state=3): >>><<< 28173 1726882780.24910: stdout chunk (state=3): >>><<< 28173 1726882780.25000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.25004: _low_level_execute_command(): starting 28173 1726882780.25007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/AnsiballZ_command.py && sleep 0' 28173 1726882780.26456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.26480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.26497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.26538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.26648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.26715: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.26731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.26790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.26838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.26868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.26927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.27021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.27090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.27103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.27114: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.27137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.27506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.27576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.27677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.27898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.42928: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:39:40.409614", "end": "2024-09-20 21:39:40.425061", "delta": "0:00:00.015447", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882780.44158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882780.44238: stderr chunk (state=3): >>><<< 28173 1726882780.44243: stdout chunk (state=3): >>><<< 28173 1726882780.44401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:39:40.409614", "end": "2024-09-20 21:39:40.425061", "delta": "0:00:00.015447", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882780.44409: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882780.44411: _low_level_execute_command(): starting 28173 1726882780.44414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882780.118999-29623-104442320772671/ > /dev/null 2>&1 && sleep 0' 28173 1726882780.45033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.45057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.45080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.45099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.45141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.45152: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.45181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.45200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.45213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.45228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.45243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.45258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.45293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.45309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.45322: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.45337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.45429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.45445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.45459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.45601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.47461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.47467: stdout chunk (state=3): >>><<< 28173 1726882780.47478: stderr chunk (state=3): >>><<< 28173 1726882780.47493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.47499: handler run complete 28173 1726882780.47526: Evaluated conditional (False): False 28173 1726882780.47537: attempt loop complete, returning result 28173 1726882780.47540: _execute() done 28173 1726882780.47542: dumping result to json 28173 1726882780.47544: done dumping result, returning 28173 1726882780.47557: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0e448fcc-3ce9-926c-8928-000000000780] 28173 1726882780.47559: sending task result for task 0e448fcc-3ce9-926c-8928-000000000780 28173 1726882780.47669: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000780 28173 1726882780.47672: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.015447", "end": "2024-09-20 21:39:40.425061", "rc": 0, "start": "2024-09-20 21:39:40.409614" } 28173 1726882780.47734: no more pending results, returning what we have 28173 1726882780.47738: results queue empty 28173 1726882780.47738: checking for any_errors_fatal 28173 1726882780.47740: done checking for any_errors_fatal 28173 1726882780.47741: checking for max_fail_percentage 28173 1726882780.47742: done checking for max_fail_percentage 28173 1726882780.47743: checking to see if all hosts have failed and the running result is not ok 28173 1726882780.47744: done checking to see if all hosts have failed 28173 1726882780.47745: getting the remaining hosts for this loop 28173 1726882780.47746: done getting the remaining hosts for this loop 28173 1726882780.47749: getting the next task for host managed_node2 28173 1726882780.47757: done getting next task for host managed_node2 28173 1726882780.47760: ^ task is: TASK: meta (flush_handlers) 28173 1726882780.47762: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882780.47768: getting variables 28173 1726882780.47770: in VariableManager get_vars() 28173 1726882780.47803: Calling all_inventory to load vars for managed_node2 28173 1726882780.47806: Calling groups_inventory to load vars for managed_node2 28173 1726882780.47809: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882780.47820: Calling all_plugins_play to load vars for managed_node2 28173 1726882780.47822: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882780.47825: Calling groups_plugins_play to load vars for managed_node2 28173 1726882780.49838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.52410: done with get_vars() 28173 1726882780.52439: done getting variables 28173 1726882780.52573: in VariableManager get_vars() 28173 1726882780.52583: Calling all_inventory to load vars for managed_node2 28173 1726882780.52585: Calling groups_inventory to load vars for managed_node2 28173 1726882780.52587: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882780.52592: Calling all_plugins_play to load vars for managed_node2 28173 1726882780.52594: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882780.52597: Calling groups_plugins_play to load vars for managed_node2 28173 1726882780.54356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.57273: done with get_vars() 28173 1726882780.57304: done queuing things up, now waiting for results queue to drain 28173 1726882780.57306: results queue empty 28173 1726882780.57307: checking for any_errors_fatal 28173 1726882780.57310: done checking for any_errors_fatal 28173 1726882780.57311: checking for max_fail_percentage 28173 1726882780.57312: done checking for max_fail_percentage 28173 1726882780.57313: checking to see if all hosts have failed and the running result is not ok 28173 1726882780.57314: done checking to see if all hosts have failed 28173 1726882780.57314: getting the remaining hosts for this loop 28173 1726882780.57315: done getting the remaining hosts for this loop 28173 1726882780.57318: getting the next task for host managed_node2 28173 1726882780.57357: done getting next task for host managed_node2 28173 1726882780.57359: ^ task is: TASK: meta (flush_handlers) 28173 1726882780.57361: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882780.57367: getting variables 28173 1726882780.57369: in VariableManager get_vars() 28173 1726882780.57378: Calling all_inventory to load vars for managed_node2 28173 1726882780.57381: Calling groups_inventory to load vars for managed_node2 28173 1726882780.57383: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882780.57388: Calling all_plugins_play to load vars for managed_node2 28173 1726882780.57391: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882780.57394: Calling groups_plugins_play to load vars for managed_node2 28173 1726882780.58789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.60730: done with get_vars() 28173 1726882780.60752: done getting variables 28173 1726882780.60815: in VariableManager get_vars() 28173 1726882780.60824: Calling all_inventory to load vars for managed_node2 28173 1726882780.60827: Calling groups_inventory to load vars for managed_node2 28173 1726882780.60829: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882780.60834: Calling all_plugins_play to load vars for managed_node2 28173 1726882780.60836: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882780.60839: Calling groups_plugins_play to load vars for managed_node2 28173 1726882780.62245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.64135: done with get_vars() 28173 1726882780.64162: done queuing things up, now waiting for results queue to drain 28173 1726882780.64167: results queue empty 28173 1726882780.64168: checking for any_errors_fatal 28173 1726882780.64170: done checking for any_errors_fatal 28173 1726882780.64170: checking for max_fail_percentage 28173 1726882780.64172: done checking for max_fail_percentage 28173 1726882780.64172: checking to see if all hosts have failed and the running result is not ok 28173 1726882780.64173: done checking to see if all hosts have failed 28173 1726882780.64179: getting the remaining hosts for this loop 28173 1726882780.64181: done getting the remaining hosts for this loop 28173 1726882780.64184: getting the next task for host managed_node2 28173 1726882780.64190: done getting next task for host managed_node2 28173 1726882780.64191: ^ task is: None 28173 1726882780.64193: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882780.64194: done queuing things up, now waiting for results queue to drain 28173 1726882780.64195: results queue empty 28173 1726882780.64196: checking for any_errors_fatal 28173 1726882780.64197: done checking for any_errors_fatal 28173 1726882780.64198: checking for max_fail_percentage 28173 1726882780.64199: done checking for max_fail_percentage 28173 1726882780.64199: checking to see if all hosts have failed and the running result is not ok 28173 1726882780.64200: done checking to see if all hosts have failed 28173 1726882780.64201: getting the next task for host managed_node2 28173 1726882780.64203: done getting next task for host managed_node2 28173 1726882780.64204: ^ task is: None 28173 1726882780.64205: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882780.64244: in VariableManager get_vars() 28173 1726882780.64267: done with get_vars() 28173 1726882780.64296: in VariableManager get_vars() 28173 1726882780.64312: done with get_vars() 28173 1726882780.64317: variable 'omit' from source: magic vars 28173 1726882780.64472: variable 'profile' from source: play vars 28173 1726882780.64587: in VariableManager get_vars() 28173 1726882780.64601: done with get_vars() 28173 1726882780.64631: variable 'omit' from source: magic vars 28173 1726882780.64697: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 28173 1726882780.65424: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28173 1726882780.65447: getting the remaining hosts for this loop 28173 1726882780.65448: done getting the remaining hosts for this loop 28173 1726882780.65450: getting the next task for host managed_node2 28173 1726882780.65453: done getting next task for host managed_node2 28173 1726882780.65455: ^ task is: TASK: Gathering Facts 28173 1726882780.65457: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882780.65459: getting variables 28173 1726882780.65460: in VariableManager get_vars() 28173 1726882780.65472: Calling all_inventory to load vars for managed_node2 28173 1726882780.65477: Calling groups_inventory to load vars for managed_node2 28173 1726882780.65479: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882780.65484: Calling all_plugins_play to load vars for managed_node2 28173 1726882780.65488: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882780.65491: Calling groups_plugins_play to load vars for managed_node2 28173 1726882780.67014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882780.70151: done with get_vars() 28173 1726882780.70205: done getting variables 28173 1726882780.70257: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:39:40 -0400 (0:00:00.680) 0:00:33.867 ****** 28173 1726882780.70285: entering _queue_task() for managed_node2/gather_facts 28173 1726882780.72199: worker is 1 (out of 1 available) 28173 1726882780.72209: exiting _queue_task() for managed_node2/gather_facts 28173 1726882780.72222: done queuing things up, now waiting for results queue to drain 28173 1726882780.72224: waiting for pending results... 
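The play that starts here, PLAY [Remove {{ profile }}] from remove_profile.yml, re-gathers facts and then removes the connection profile brought down earlier in the test. A hedged sketch of the shape such a play could take; the hosts pattern, the vars block, and the use of the fedora.linux_system_roles.network role with persistent_state: absent are assumptions for illustration, not the verbatim contents of remove_profile.yml:

- name: Remove {{ profile }}
  hosts: all                        # assumed pattern; this run targets managed_node2
  vars:
    profile: ethtest0               # illustrative; the test sets the profile/interface via set_fact
  tasks:
    - name: Remove the profile with the network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ profile }}"
            persistent_state: absent   # assumption: profile removal via the role's network_connections setting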
28173 1726882780.72644: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882780.72761: in run() - task 0e448fcc-3ce9-926c-8928-00000000078e 28173 1726882780.72794: variable 'ansible_search_path' from source: unknown 28173 1726882780.72844: calling self._execute() 28173 1726882780.72978: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.72997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.73015: variable 'omit' from source: magic vars 28173 1726882780.73443: variable 'ansible_distribution_major_version' from source: facts 28173 1726882780.73469: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882780.73482: variable 'omit' from source: magic vars 28173 1726882780.73511: variable 'omit' from source: magic vars 28173 1726882780.73556: variable 'omit' from source: magic vars 28173 1726882780.73611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882780.73658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882780.73690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882780.73711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882780.73728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882780.73768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882780.73782: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.73794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.73910: Set connection var ansible_pipelining to False 28173 1726882780.73918: Set connection var ansible_shell_type to sh 28173 1726882780.73930: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882780.73943: Set connection var ansible_timeout to 10 28173 1726882780.73953: Set connection var ansible_shell_executable to /bin/sh 28173 1726882780.73975: Set connection var ansible_connection to ssh 28173 1726882780.74032: variable 'ansible_shell_executable' from source: unknown 28173 1726882780.74041: variable 'ansible_connection' from source: unknown 28173 1726882780.74058: variable 'ansible_module_compression' from source: unknown 28173 1726882780.74076: variable 'ansible_shell_type' from source: unknown 28173 1726882780.74117: variable 'ansible_shell_executable' from source: unknown 28173 1726882780.74143: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882780.74152: variable 'ansible_pipelining' from source: unknown 28173 1726882780.74158: variable 'ansible_timeout' from source: unknown 28173 1726882780.74167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882780.74452: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882780.74473: variable 'omit' from source: magic vars 28173 1726882780.74483: starting attempt loop 28173 1726882780.74488: running the 
handler 28173 1726882780.74508: variable 'ansible_facts' from source: unknown 28173 1726882780.74537: _low_level_execute_command(): starting 28173 1726882780.74562: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882780.75373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.75394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.75409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.75430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.75482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.75499: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.75515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.75537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.75549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.75562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.75577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.75590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.75610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.75623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.75637: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.75650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.75737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.75762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.75782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.75915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.77554: stdout chunk (state=3): >>>/root <<< 28173 1726882780.77735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.77738: stdout chunk (state=3): >>><<< 28173 1726882780.77740: stderr chunk (state=3): >>><<< 28173 1726882780.77848: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.77851: _low_level_execute_command(): starting 28173 1726882780.77855: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921 `" && echo ansible-tmp-1726882780.7775738-29653-207611463294921="` echo /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921 `" ) && sleep 0' 28173 1726882780.79625: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.79629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.79665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.79670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.79673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.79761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.79767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.79854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.81739: stdout chunk (state=3): >>>ansible-tmp-1726882780.7775738-29653-207611463294921=/root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921 <<< 28173 1726882780.81918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.81921: stdout chunk (state=3): >>><<< 28173 1726882780.81928: stderr chunk (state=3): >>><<< 28173 1726882780.82070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882780.7775738-29653-207611463294921=/root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.82074: variable 'ansible_module_compression' from source: unknown 28173 1726882780.82076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882780.82091: variable 'ansible_facts' from source: unknown 28173 1726882780.82254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/AnsiballZ_setup.py 28173 1726882780.83058: Sending initial data 28173 1726882780.83071: Sent initial data (154 bytes) 28173 1726882780.84224: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.84228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.84387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.84390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.84448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.84585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.84589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.84700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.86506: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882780.86602: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882780.86700: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpf1kfek9k 
/root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/AnsiballZ_setup.py <<< 28173 1726882780.86798: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882780.89900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.90155: stderr chunk (state=3): >>><<< 28173 1726882780.90159: stdout chunk (state=3): >>><<< 28173 1726882780.90161: done transferring module to remote 28173 1726882780.90162: _low_level_execute_command(): starting 28173 1726882780.90180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/ /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/AnsiballZ_setup.py && sleep 0' 28173 1726882780.91646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882780.91657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.91675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.91694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.91739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.91749: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882780.91760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.91784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882780.91799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882780.91816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882780.91833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.91845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882780.91858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.91874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882780.91885: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882780.91897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.91985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.92001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882780.92014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.92146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882780.93986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882780.94056: stderr chunk (state=3): >>><<< 28173 1726882780.94059: stdout chunk (state=3): >>><<< 28173 1726882780.94156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882780.94160: _low_level_execute_command(): starting 28173 1726882780.94163: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/AnsiballZ_setup.py && sleep 0' 28173 1726882780.94696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882780.94700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882780.94733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.94736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882780.94738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882780.94787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882780.94791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882780.94902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882781.47257: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": 
"ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "41", "epoch": "1726882781", "epoch_int": "1726882781", "date": "2024-09-20", "time": "21:39:41", "iso8601_micro": "2024-09-21T01:39:41.200441Z", "iso8601": "2024-09-21T01:39:41Z", "iso8601_basic": "20240920T213941200441", "iso8601_basic_short": "20240920T213941", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", 
"LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2777, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 755, "free": 2777}, "nocache": {"free": 3241, "used": 291}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_produ<<< 28173 1726882781.47297: stdout chunk (state=3): >>>ct_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238379008, "block_size": 4096, "block_total": 65519355, "block_available": 64511323, "block_used": 1008032, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", 
"has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.51, "5m": 0.43, "15m": 0.26}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882781.48883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882781.48975: stderr chunk (state=3): >>><<< 28173 1726882781.48979: stdout chunk (state=3): >>><<< 28173 1726882781.49172: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "41", "epoch": "1726882781", "epoch_int": "1726882781", "date": "2024-09-20", "time": "21:39:41", "iso8601_micro": "2024-09-21T01:39:41.200441Z", "iso8601": "2024-09-21T01:39:41Z", "iso8601_basic": "20240920T213941200441", "iso8601_basic_short": "20240920T213941", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2777, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 755, "free": 2777}, "nocache": {"free": 3241, "used": 291}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238379008, "block_size": 4096, "block_total": 65519355, "block_available": 64511323, "block_used": 1008032, "inode_total": 131071472, "inode_available": 130998691, "inode_used": 72781, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.51, "5m": 0.43, "15m": 0.26}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": 
"22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882781.49456: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882781.49491: _low_level_execute_command(): starting 28173 1726882781.49502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882780.7775738-29653-207611463294921/ > /dev/null 2>&1 && sleep 0' 28173 1726882781.50202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882781.50217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882781.50233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882781.50255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882781.50304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882781.50316: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882781.50331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882781.50352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882781.50370: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882781.50383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882781.50396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882781.50411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882781.50428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882781.50441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882781.50453: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882781.50476: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882781.50551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882781.50574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882781.50590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882781.50724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882781.52517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882781.52621: stderr chunk (state=3): >>><<< 28173 1726882781.52633: stdout chunk (state=3): >>><<< 28173 1726882781.52937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882781.52972: handler run complete 28173 1726882781.53177: variable 'ansible_facts' from source: unknown 28173 1726882781.53205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.54555: variable 'ansible_facts' from source: unknown 28173 1726882781.54671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.56118: attempt loop complete, returning result 28173 1726882781.56181: _execute() done 28173 1726882781.56189: dumping result to json 28173 1726882781.56289: done dumping result, returning 28173 1726882781.56371: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-00000000078e] 28173 1726882781.56384: sending task result for task 0e448fcc-3ce9-926c-8928-00000000078e ok: [managed_node2] 28173 1726882781.57318: no more pending results, returning what we have 28173 1726882781.57322: results queue empty 28173 1726882781.57323: checking for any_errors_fatal 28173 1726882781.57324: done checking for any_errors_fatal 28173 1726882781.57325: checking for max_fail_percentage 28173 1726882781.57326: done checking for max_fail_percentage 28173 1726882781.57327: checking to see if all hosts have failed and the running result is not ok 28173 1726882781.57328: done checking to see if all hosts have failed 28173 1726882781.57329: getting the remaining hosts for this loop 28173 1726882781.57331: done getting the remaining hosts for this loop 28173 1726882781.57334: getting the next task for host 
managed_node2 28173 1726882781.57340: done getting next task for host managed_node2 28173 1726882781.57342: ^ task is: TASK: meta (flush_handlers) 28173 1726882781.57345: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882781.57349: getting variables 28173 1726882781.57351: in VariableManager get_vars() 28173 1726882781.57395: Calling all_inventory to load vars for managed_node2 28173 1726882781.57398: Calling groups_inventory to load vars for managed_node2 28173 1726882781.57401: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882781.57413: Calling all_plugins_play to load vars for managed_node2 28173 1726882781.57416: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882781.57419: Calling groups_plugins_play to load vars for managed_node2 28173 1726882781.58215: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000078e 28173 1726882781.58219: WORKER PROCESS EXITING 28173 1726882781.59578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.63887: done with get_vars() 28173 1726882781.63922: done getting variables 28173 1726882781.64072: in VariableManager get_vars() 28173 1726882781.64086: Calling all_inventory to load vars for managed_node2 28173 1726882781.64089: Calling groups_inventory to load vars for managed_node2 28173 1726882781.64091: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882781.64096: Calling all_plugins_play to load vars for managed_node2 28173 1726882781.64098: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882781.64106: Calling groups_plugins_play to load vars for managed_node2 28173 1726882781.65593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.68936: done with get_vars() 28173 1726882781.68971: done queuing things up, now waiting for results queue to drain 28173 1726882781.68974: results queue empty 28173 1726882781.68975: checking for any_errors_fatal 28173 1726882781.68979: done checking for any_errors_fatal 28173 1726882781.68980: checking for max_fail_percentage 28173 1726882781.68981: done checking for max_fail_percentage 28173 1726882781.68982: checking to see if all hosts have failed and the running result is not ok 28173 1726882781.68982: done checking to see if all hosts have failed 28173 1726882781.68983: getting the remaining hosts for this loop 28173 1726882781.68984: done getting the remaining hosts for this loop 28173 1726882781.68987: getting the next task for host managed_node2 28173 1726882781.68991: done getting next task for host managed_node2 28173 1726882781.68994: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882781.68995: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882781.69005: getting variables 28173 1726882781.69006: in VariableManager get_vars() 28173 1726882781.69021: Calling all_inventory to load vars for managed_node2 28173 1726882781.69023: Calling groups_inventory to load vars for managed_node2 28173 1726882781.69024: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882781.69029: Calling all_plugins_play to load vars for managed_node2 28173 1726882781.69037: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882781.69041: Calling groups_plugins_play to load vars for managed_node2 28173 1726882781.70400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.73177: done with get_vars() 28173 1726882781.73205: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:41 -0400 (0:00:01.030) 0:00:34.897 ****** 28173 1726882781.73295: entering _queue_task() for managed_node2/include_tasks 28173 1726882781.73651: worker is 1 (out of 1 available) 28173 1726882781.73669: exiting _queue_task() for managed_node2/include_tasks 28173 1726882781.73681: done queuing things up, now waiting for results queue to drain 28173 1726882781.73683: waiting for pending results... 28173 1726882781.73978: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28173 1726882781.74095: in run() - task 0e448fcc-3ce9-926c-8928-0000000000d7 28173 1726882781.74115: variable 'ansible_search_path' from source: unknown 28173 1726882781.74124: variable 'ansible_search_path' from source: unknown 28173 1726882781.74172: calling self._execute() 28173 1726882781.74278: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882781.74288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882781.74299: variable 'omit' from source: magic vars 28173 1726882781.74802: variable 'ansible_distribution_major_version' from source: facts 28173 1726882781.74821: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882781.74833: _execute() done 28173 1726882781.74841: dumping result to json 28173 1726882781.74848: done dumping result, returning 28173 1726882781.74858: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-926c-8928-0000000000d7] 28173 1726882781.74876: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d7 28173 1726882781.75030: no more pending results, returning what we have 28173 1726882781.75036: in VariableManager get_vars() 28173 1726882781.75086: Calling all_inventory to load vars for managed_node2 28173 1726882781.75089: Calling groups_inventory to load vars for managed_node2 28173 1726882781.75091: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882781.75104: Calling all_plugins_play to load vars for managed_node2 28173 1726882781.75107: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882781.75109: Calling groups_plugins_play to load vars for managed_node2 28173 1726882781.76252: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d7 28173 1726882781.76256: WORKER PROCESS EXITING 28173 1726882781.77989: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.80865: done with get_vars() 28173 1726882781.80891: variable 'ansible_search_path' from source: unknown 28173 1726882781.80892: variable 'ansible_search_path' from source: unknown 28173 1726882781.80922: we have included files to process 28173 1726882781.80923: generating all_blocks data 28173 1726882781.80924: done generating all_blocks data 28173 1726882781.80925: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882781.80926: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882781.80928: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28173 1726882781.82279: done processing included file 28173 1726882781.82281: iterating over new_blocks loaded from include file 28173 1726882781.82282: in VariableManager get_vars() 28173 1726882781.82304: done with get_vars() 28173 1726882781.82306: filtering new block on tags 28173 1726882781.82322: done filtering new block on tags 28173 1726882781.82324: in VariableManager get_vars() 28173 1726882781.82345: done with get_vars() 28173 1726882781.82347: filtering new block on tags 28173 1726882781.82486: done filtering new block on tags 28173 1726882781.82489: in VariableManager get_vars() 28173 1726882781.82508: done with get_vars() 28173 1726882781.82510: filtering new block on tags 28173 1726882781.82526: done filtering new block on tags 28173 1726882781.82528: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28173 1726882781.82533: extending task lists for all hosts with included blocks 28173 1726882781.83393: done extending task lists 28173 1726882781.83394: done processing included files 28173 1726882781.83395: results queue empty 28173 1726882781.83396: checking for any_errors_fatal 28173 1726882781.83397: done checking for any_errors_fatal 28173 1726882781.83398: checking for max_fail_percentage 28173 1726882781.83399: done checking for max_fail_percentage 28173 1726882781.83400: checking to see if all hosts have failed and the running result is not ok 28173 1726882781.83401: done checking to see if all hosts have failed 28173 1726882781.83401: getting the remaining hosts for this loop 28173 1726882781.83403: done getting the remaining hosts for this loop 28173 1726882781.83406: getting the next task for host managed_node2 28173 1726882781.83410: done getting next task for host managed_node2 28173 1726882781.83412: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882781.83415: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882781.83424: getting variables 28173 1726882781.83425: in VariableManager get_vars() 28173 1726882781.83439: Calling all_inventory to load vars for managed_node2 28173 1726882781.83442: Calling groups_inventory to load vars for managed_node2 28173 1726882781.83559: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882781.83572: Calling all_plugins_play to load vars for managed_node2 28173 1726882781.83577: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882781.83581: Calling groups_plugins_play to load vars for managed_node2 28173 1726882781.86548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882781.90665: done with get_vars() 28173 1726882781.90693: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:41 -0400 (0:00:00.176) 0:00:35.073 ****** 28173 1726882781.90901: entering _queue_task() for managed_node2/setup 28173 1726882781.91699: worker is 1 (out of 1 available) 28173 1726882781.91830: exiting _queue_task() for managed_node2/setup 28173 1726882781.91841: done queuing things up, now waiting for results queue to drain 28173 1726882781.91843: waiting for pending results... 28173 1726882781.92740: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28173 1726882781.93055: in run() - task 0e448fcc-3ce9-926c-8928-0000000007cf 28173 1726882781.93221: variable 'ansible_search_path' from source: unknown 28173 1726882781.93239: variable 'ansible_search_path' from source: unknown 28173 1726882781.93300: calling self._execute() 28173 1726882781.93511: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882781.93655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882781.93684: variable 'omit' from source: magic vars 28173 1726882781.94496: variable 'ansible_distribution_major_version' from source: facts 28173 1726882781.94654: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882781.95048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882782.01364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882782.01558: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882782.01667: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882782.01811: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882782.01850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882782.01993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882782.02141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 28173 1726882782.02180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882782.02227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882782.02298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882782.02413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882782.02488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882782.02614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882782.02659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882782.02710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882782.03238: variable '__network_required_facts' from source: role '' defaults 28173 1726882782.03261: variable 'ansible_facts' from source: unknown 28173 1726882782.06230: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28173 1726882782.06397: when evaluation is False, skipping this task 28173 1726882782.06404: _execute() done 28173 1726882782.06410: dumping result to json 28173 1726882782.06416: done dumping result, returning 28173 1726882782.06426: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-926c-8928-0000000007cf] 28173 1726882782.06435: sending task result for task 0e448fcc-3ce9-926c-8928-0000000007cf skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882782.06583: no more pending results, returning what we have 28173 1726882782.06588: results queue empty 28173 1726882782.06589: checking for any_errors_fatal 28173 1726882782.06591: done checking for any_errors_fatal 28173 1726882782.06592: checking for max_fail_percentage 28173 1726882782.06593: done checking for max_fail_percentage 28173 1726882782.06594: checking to see if all hosts have failed and the running result is not ok 28173 1726882782.06595: done checking to see if all hosts have failed 28173 1726882782.06596: getting the remaining hosts for this loop 28173 1726882782.06597: done getting the remaining hosts for this loop 28173 1726882782.06601: getting the next task for host managed_node2 28173 1726882782.06610: done getting next task for host 
managed_node2 28173 1726882782.06614: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882782.06617: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882782.06630: getting variables 28173 1726882782.06632: in VariableManager get_vars() 28173 1726882782.06680: Calling all_inventory to load vars for managed_node2 28173 1726882782.06683: Calling groups_inventory to load vars for managed_node2 28173 1726882782.06686: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882782.06697: Calling all_plugins_play to load vars for managed_node2 28173 1726882782.06700: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882782.06703: Calling groups_plugins_play to load vars for managed_node2 28173 1726882782.07724: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000007cf 28173 1726882782.07727: WORKER PROCESS EXITING 28173 1726882782.10006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882782.13552: done with get_vars() 28173 1726882782.13594: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:42 -0400 (0:00:00.229) 0:00:35.302 ****** 28173 1726882782.13825: entering _queue_task() for managed_node2/stat 28173 1726882782.14233: worker is 1 (out of 1 available) 28173 1726882782.14245: exiting _queue_task() for managed_node2/stat 28173 1726882782.14257: done queuing things up, now waiting for results queue to drain 28173 1726882782.14258: waiting for pending results... 
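
The skip recorded above for "Ensure ansible_facts used by role are present" follows from the conditional shown in the log, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluating to False: every fact the role depends on is already present in ansible_facts, so an extra setup pass is unnecessary. The log confirms this task is a setup action defined at set_facts.yml:3, but its body is not reproduced here; the following is only a rough sketch of what such a guarded fact-gathering task could look like, and restricting gather_subset to the role's required-facts variable is an assumption made for illustration.

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    # assumption: gather only the subsets the role declares it needs
    gather_subset: "{{ __network_required_facts }}"
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
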
28173 1726882782.15452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28173 1726882782.15571: in run() - task 0e448fcc-3ce9-926c-8928-0000000007d1 28173 1726882782.15583: variable 'ansible_search_path' from source: unknown 28173 1726882782.15586: variable 'ansible_search_path' from source: unknown 28173 1726882782.15622: calling self._execute() 28173 1726882782.15714: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882782.15719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882782.15728: variable 'omit' from source: magic vars 28173 1726882782.16902: variable 'ansible_distribution_major_version' from source: facts 28173 1726882782.16915: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882782.17282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882782.17758: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882782.17826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882782.17841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882782.17872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882782.18152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882782.18175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882782.18200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882782.19260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882782.19347: variable '__network_is_ostree' from source: set_fact 28173 1726882782.19378: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882782.19381: when evaluation is False, skipping this task 28173 1726882782.19384: _execute() done 28173 1726882782.19386: dumping result to json 28173 1726882782.19389: done dumping result, returning 28173 1726882782.19391: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-926c-8928-0000000007d1] 28173 1726882782.19396: sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d1 28173 1726882782.19502: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d1 28173 1726882782.19505: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882782.19576: no more pending results, returning what we have 28173 1726882782.19581: results queue empty 28173 1726882782.19582: checking for any_errors_fatal 28173 1726882782.19588: done checking for any_errors_fatal 28173 1726882782.19589: checking for 
max_fail_percentage 28173 1726882782.19590: done checking for max_fail_percentage 28173 1726882782.19592: checking to see if all hosts have failed and the running result is not ok 28173 1726882782.19593: done checking to see if all hosts have failed 28173 1726882782.19594: getting the remaining hosts for this loop 28173 1726882782.19595: done getting the remaining hosts for this loop 28173 1726882782.19599: getting the next task for host managed_node2 28173 1726882782.19605: done getting next task for host managed_node2 28173 1726882782.19610: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882782.19613: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882782.19631: getting variables 28173 1726882782.19636: in VariableManager get_vars() 28173 1726882782.19685: Calling all_inventory to load vars for managed_node2 28173 1726882782.19688: Calling groups_inventory to load vars for managed_node2 28173 1726882782.19690: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882782.19701: Calling all_plugins_play to load vars for managed_node2 28173 1726882782.19703: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882782.19705: Calling groups_plugins_play to load vars for managed_node2 28173 1726882782.23205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882782.27637: done with get_vars() 28173 1726882782.27672: done getting variables 28173 1726882782.27849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:42 -0400 (0:00:00.140) 0:00:35.443 ****** 28173 1726882782.27890: entering _queue_task() for managed_node2/set_fact 28173 1726882782.28555: worker is 1 (out of 1 available) 28173 1726882782.28570: exiting _queue_task() for managed_node2/set_fact 28173 1726882782.28583: done queuing things up, now waiting for results queue to drain 28173 1726882782.28584: waiting for pending results... 
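
The "Check if system is ostree" stat task above and the "Set flag to indicate system is ostree" set_fact task queued here are guarded by the same condition, not __network_is_ostree is defined, so once the flag has been set earlier in the run both are skipped (the skip results report that false_condition). A minimal sketch of that detect-once pattern follows, assuming the probe stats /run/ostree-booted and registers into a hypothetical variable named __ostree_booted_stat; neither detail appears in this log.

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed probe path, for illustration only
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

On this run both tasks report skip_reason "Conditional result was False", which is the expected behaviour whenever the flag has already been cached by an earlier pass through the role.
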
28173 1726882782.29843: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28173 1726882782.30034: in run() - task 0e448fcc-3ce9-926c-8928-0000000007d2 28173 1726882782.30061: variable 'ansible_search_path' from source: unknown 28173 1726882782.30068: variable 'ansible_search_path' from source: unknown 28173 1726882782.30096: calling self._execute() 28173 1726882782.30271: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882782.30281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882782.30353: variable 'omit' from source: magic vars 28173 1726882782.31486: variable 'ansible_distribution_major_version' from source: facts 28173 1726882782.31499: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882782.32016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882782.32705: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882782.32793: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882782.32921: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882782.32997: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882782.33081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882782.33358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882782.33435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882782.33473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882782.33723: variable '__network_is_ostree' from source: set_fact 28173 1726882782.33731: Evaluated conditional (not __network_is_ostree is defined): False 28173 1726882782.33734: when evaluation is False, skipping this task 28173 1726882782.33737: _execute() done 28173 1726882782.33739: dumping result to json 28173 1726882782.33746: done dumping result, returning 28173 1726882782.33978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-926c-8928-0000000007d2] 28173 1726882782.33982: sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d2 28173 1726882782.34104: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d2 28173 1726882782.34107: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28173 1726882782.34178: no more pending results, returning what we have 28173 1726882782.34185: results queue empty 28173 1726882782.34189: checking for any_errors_fatal 28173 1726882782.34195: done checking for any_errors_fatal 28173 
1726882782.34196: checking for max_fail_percentage 28173 1726882782.34197: done checking for max_fail_percentage 28173 1726882782.34198: checking to see if all hosts have failed and the running result is not ok 28173 1726882782.34199: done checking to see if all hosts have failed 28173 1726882782.34200: getting the remaining hosts for this loop 28173 1726882782.34201: done getting the remaining hosts for this loop 28173 1726882782.34204: getting the next task for host managed_node2 28173 1726882782.34214: done getting next task for host managed_node2 28173 1726882782.34217: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882782.34220: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882782.34233: getting variables 28173 1726882782.34235: in VariableManager get_vars() 28173 1726882782.34279: Calling all_inventory to load vars for managed_node2 28173 1726882782.34285: Calling groups_inventory to load vars for managed_node2 28173 1726882782.34289: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882782.34309: Calling all_plugins_play to load vars for managed_node2 28173 1726882782.34315: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882782.34323: Calling groups_plugins_play to load vars for managed_node2 28173 1726882782.38508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882782.45047: done with get_vars() 28173 1726882782.45202: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:42 -0400 (0:00:00.174) 0:00:35.617 ****** 28173 1726882782.45337: entering _queue_task() for managed_node2/service_facts 28173 1726882782.46860: worker is 1 (out of 1 available) 28173 1726882782.46878: exiting _queue_task() for managed_node2/service_facts 28173 1726882782.46890: done queuing things up, now waiting for results queue to drain 28173 1726882782.46891: waiting for pending results... 
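
The "Check which services are running" task queued here is a service_facts action (managed_node2/service_facts in the log). The lines that follow show the usual execution pipeline: create a remote temp directory, copy the AnsiballZ-packaged module over the multiplexed SSH connection, chmod it, run it with the remote Python, and read back a JSON result that lands under ansible_facts.services. A short sketch of gathering and then consuming that fact; the debug task is a hypothetical consumer added only to show the lookup shape, using the NetworkManager.service key that appears in the data returned later in this log.

- name: Check which services are running
  ansible.builtin.service_facts:

- name: Show NetworkManager state (hypothetical consumer, not part of the role)
  ansible.builtin.debug:
    msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
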
28173 1726882782.48698: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28173 1726882782.49898: in run() - task 0e448fcc-3ce9-926c-8928-0000000007d4 28173 1726882782.50004: variable 'ansible_search_path' from source: unknown 28173 1726882782.50012: variable 'ansible_search_path' from source: unknown 28173 1726882782.50060: calling self._execute() 28173 1726882782.50344: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882782.50376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882782.50394: variable 'omit' from source: magic vars 28173 1726882782.51172: variable 'ansible_distribution_major_version' from source: facts 28173 1726882782.51240: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882782.51282: variable 'omit' from source: magic vars 28173 1726882782.51396: variable 'omit' from source: magic vars 28173 1726882782.51589: variable 'omit' from source: magic vars 28173 1726882782.51736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882782.51779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882782.51861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882782.51936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882782.51991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882782.52057: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882782.52132: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882782.52141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882782.52323: Set connection var ansible_pipelining to False 28173 1726882782.52378: Set connection var ansible_shell_type to sh 28173 1726882782.52399: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882782.52462: Set connection var ansible_timeout to 10 28173 1726882782.52478: Set connection var ansible_shell_executable to /bin/sh 28173 1726882782.52487: Set connection var ansible_connection to ssh 28173 1726882782.52517: variable 'ansible_shell_executable' from source: unknown 28173 1726882782.52570: variable 'ansible_connection' from source: unknown 28173 1726882782.52581: variable 'ansible_module_compression' from source: unknown 28173 1726882782.52589: variable 'ansible_shell_type' from source: unknown 28173 1726882782.52597: variable 'ansible_shell_executable' from source: unknown 28173 1726882782.52612: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882782.52679: variable 'ansible_pipelining' from source: unknown 28173 1726882782.52687: variable 'ansible_timeout' from source: unknown 28173 1726882782.52694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882782.53124: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882782.53165: variable 'omit' from source: magic vars 28173 
1726882782.53222: starting attempt loop 28173 1726882782.53230: running the handler 28173 1726882782.53248: _low_level_execute_command(): starting 28173 1726882782.53271: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882782.56711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882782.56857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.56879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.56899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.56946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.56962: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882782.56983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.57003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882782.57015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882782.57032: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882782.57046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.57064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.57089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.57188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.57204: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882782.57219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.57306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882782.57409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882782.57432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882782.57576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882782.59249: stdout chunk (state=3): >>>/root <<< 28173 1726882782.59392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882782.59471: stderr chunk (state=3): >>><<< 28173 1726882782.59474: stdout chunk (state=3): >>><<< 28173 1726882782.59596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882782.59600: _low_level_execute_command(): starting 28173 1726882782.59603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898 `" && echo ansible-tmp-1726882782.5949605-29746-99081874755898="` echo /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898 `" ) && sleep 0' 28173 1726882782.60970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882782.61120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.61135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.61152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.61200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.61215: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882782.61233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.61252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882782.61269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882782.61283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882782.61296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.61310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.61329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.61346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.61357: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882782.61377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.61515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882782.61573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882782.61593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882782.61725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882782.63635: stdout chunk (state=3): >>>ansible-tmp-1726882782.5949605-29746-99081874755898=/root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898 <<< 28173 1726882782.63784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882782.63861: stderr chunk (state=3): >>><<< 28173 1726882782.63869: stdout chunk (state=3): >>><<< 28173 1726882782.64075: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882782.5949605-29746-99081874755898=/root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882782.64078: variable 'ansible_module_compression' from source: unknown 28173 1726882782.64081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28173 1726882782.64083: variable 'ansible_facts' from source: unknown 28173 1726882782.64123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/AnsiballZ_service_facts.py 28173 1726882782.64797: Sending initial data 28173 1726882782.64800: Sent initial data (161 bytes) 28173 1726882782.68936: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.68939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.68961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.69273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882782.69389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882782.69529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882782.71331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882782.71428: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882782.71530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmph4dner_8 /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/AnsiballZ_service_facts.py <<< 28173 1726882782.71622: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882782.73178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882782.73305: stderr chunk (state=3): >>><<< 28173 1726882782.73308: stdout chunk (state=3): >>><<< 28173 1726882782.73327: done transferring module to remote 28173 1726882782.73339: _low_level_execute_command(): starting 28173 1726882782.73344: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/ /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/AnsiballZ_service_facts.py && sleep 0' 28173 1726882782.75746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882782.75893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.75904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.75918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.76012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.76020: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882782.76030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.76044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882782.76051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882782.76058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882782.76069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.76078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.76111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.76119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.76125: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882782.76135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.76205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882782.76331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882782.76443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 
1726882782.76656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882782.78473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882782.78477: stdout chunk (state=3): >>><<< 28173 1726882782.78485: stderr chunk (state=3): >>><<< 28173 1726882782.78503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882782.78506: _low_level_execute_command(): starting 28173 1726882782.78512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/AnsiballZ_service_facts.py && sleep 0' 28173 1726882782.80736: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882782.80871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.80879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.80894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.80932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.80973: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882782.81082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.81096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882782.81182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882782.81190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882782.81200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882782.81210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882782.81222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882782.81230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882782.81237: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882782.81246: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882782.81320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882782.81415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882782.81421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882782.81844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.14305: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 28173 1726882784.14353: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 28173 1726882784.14358: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "<<< 28173 1726882784.14367: stdout chunk (state=3): >>>source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28173 1726882784.15562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882784.15698: stderr chunk (state=3): >>><<< 28173 1726882784.15702: stdout chunk (state=3): >>><<< 28173 1726882784.15900: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": 
"dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882784.16936: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882784.16952: _low_level_execute_command(): starting 28173 1726882784.16958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882782.5949605-29746-99081874755898/ > /dev/null 2>&1 && sleep 0' 28173 1726882784.17717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882784.17720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.17733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.17743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.17796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.17801: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882784.17841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.17844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882784.17855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882784.17858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882784.17876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.17886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.17909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.17914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.17931: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882784.17948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.18035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882784.18060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882784.18065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.18196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.20020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.20108: stderr chunk (state=3): >>><<< 28173 1726882784.20127: stdout chunk (state=3): >>><<< 28173 1726882784.20132: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882784.20141: handler run complete 28173 1726882784.20290: variable 'ansible_facts' from source: unknown 28173 1726882784.20455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882784.20876: variable 'ansible_facts' from source: unknown 28173 1726882784.21067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882784.22053: attempt loop complete, returning result 28173 1726882784.22055: _execute() done 28173 1726882784.22058: dumping result to json 28173 1726882784.22059: done dumping result, returning 28173 1726882784.22061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-926c-8928-0000000007d4] 28173 1726882784.22064: sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d4 28173 1726882784.22476: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d4 28173 1726882784.22480: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882784.22538: no more pending results, returning what we have 28173 1726882784.22540: results queue empty 28173 1726882784.22541: checking for any_errors_fatal 28173 1726882784.22545: done checking for any_errors_fatal 28173 1726882784.22545: checking for max_fail_percentage 28173 1726882784.22547: done checking for max_fail_percentage 28173 1726882784.22548: checking to see if all hosts have failed and the running result is not ok 28173 1726882784.22548: done checking to see if all hosts have failed 28173 1726882784.22549: getting the remaining hosts for this loop 28173 1726882784.22550: done getting the remaining hosts for this loop 28173 1726882784.22553: getting the next task for host managed_node2 28173 1726882784.22558: done getting next task for host managed_node2 28173 1726882784.22562: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882784.22567: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882784.22576: getting variables 28173 1726882784.22578: in VariableManager get_vars() 28173 1726882784.22607: Calling all_inventory to load vars for managed_node2 28173 1726882784.22610: Calling groups_inventory to load vars for managed_node2 28173 1726882784.22612: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882784.22621: Calling all_plugins_play to load vars for managed_node2 28173 1726882784.22623: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882784.22626: Calling groups_plugins_play to load vars for managed_node2 28173 1726882784.24509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882784.27932: done with get_vars() 28173 1726882784.27970: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:44 -0400 (0:00:01.829) 0:00:37.447 ****** 28173 1726882784.28300: entering _queue_task() for managed_node2/package_facts 28173 1726882784.30406: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28173 1726882784.30443: in run() - task 0e448fcc-3ce9-926c-8928-0000000007d5 28173 1726882784.30447: variable 'ansible_search_path' from source: unknown 28173 1726882784.30449: variable 'ansible_search_path' from source: unknown 28173 1726882784.30452: calling self._execute() 28173 1726882784.30436: worker is 1 (out of 1 available) 28173 1726882784.30465: exiting _queue_task() for managed_node2/package_facts 28173 1726882784.30473: done queuing things up, now waiting for results queue to drain 28173 1726882784.30475: waiting for pending results... 
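For reference, the package_facts task queued above reports its results under ansible_facts.packages: a mapping from package name to a list of installed instances with name, version, release, epoch, arch and source keys, as visible in the stdout chunks further down in this log. The following is only a minimal Python sketch of how that structure can be inspected; the payload here is abridged from the output of this run, not a reproduction of it.

    # Sketch: reading a package_facts-style payload (abridged from the
    # stdout chunks below; only two entries kept for illustration).
    import json

    payload = '''
    {"ansible_facts": {"packages": {
      "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0",
                          "release": "1.el9", "epoch": 1, "arch": "x86_64",
                          "source": "rpm"}],
      "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9",
                   "epoch": null, "arch": "x86_64", "source": "rpm"}]
    }}}
    '''

    packages = json.loads(payload)["ansible_facts"]["packages"]

    # Each key maps to a list of installed instances of that package.
    for name, instances in packages.items():
        for pkg in instances:
            print(f"{name}-{pkg['version']}-{pkg['release']}.{pkg['arch']}")

    # A consumer of these facts would typically only need a membership test.
    print("NetworkManager installed:", "NetworkManager" in packages)

A role or playbook that consumes these facts usually only tests for the presence of a key in the packages dict; the full per-package detail is carried along for logging and version checks.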
28173 1726882784.31057: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882784.31065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882784.31079: variable 'omit' from source: magic vars 28173 1726882784.33392: variable 'ansible_distribution_major_version' from source: facts 28173 1726882784.33418: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882784.33431: variable 'omit' from source: magic vars 28173 1726882784.33517: variable 'omit' from source: magic vars 28173 1726882784.33580: variable 'omit' from source: magic vars 28173 1726882784.33643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882784.33699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882784.33727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882784.33754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882784.33788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882784.33855: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882784.33873: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882784.33897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882784.34043: Set connection var ansible_pipelining to False 28173 1726882784.34047: Set connection var ansible_shell_type to sh 28173 1726882784.34057: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882784.34076: Set connection var ansible_timeout to 10 28173 1726882784.34087: Set connection var ansible_shell_executable to /bin/sh 28173 1726882784.34095: Set connection var ansible_connection to ssh 28173 1726882784.34148: variable 'ansible_shell_executable' from source: unknown 28173 1726882784.34154: variable 'ansible_connection' from source: unknown 28173 1726882784.34161: variable 'ansible_module_compression' from source: unknown 28173 1726882784.34168: variable 'ansible_shell_type' from source: unknown 28173 1726882784.34172: variable 'ansible_shell_executable' from source: unknown 28173 1726882784.34177: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882784.34179: variable 'ansible_pipelining' from source: unknown 28173 1726882784.34186: variable 'ansible_timeout' from source: unknown 28173 1726882784.34189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882784.34473: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882784.34483: variable 'omit' from source: magic vars 28173 1726882784.34486: starting attempt loop 28173 1726882784.34494: running the handler 28173 1726882784.34502: _low_level_execute_command(): starting 28173 1726882784.34509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882784.35004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28173 1726882784.35021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.35041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.35055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.35107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882784.35112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.35233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.36894: stdout chunk (state=3): >>>/root <<< 28173 1726882784.37265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.37269: stderr chunk (state=3): >>><<< 28173 1726882784.37271: stdout chunk (state=3): >>><<< 28173 1726882784.37274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882784.37277: _low_level_execute_command(): starting 28173 1726882784.37279: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783 `" && echo ansible-tmp-1726882784.3716795-29804-227203688209783="` echo /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783 `" ) && sleep 0' 28173 1726882784.37873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.37883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.37924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882784.37928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.37937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.37989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882784.38002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.38134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.40042: stdout chunk (state=3): >>>ansible-tmp-1726882784.3716795-29804-227203688209783=/root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783 <<< 28173 1726882784.40237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.40240: stdout chunk (state=3): >>><<< 28173 1726882784.40242: stderr chunk (state=3): >>><<< 28173 1726882784.40477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882784.3716795-29804-227203688209783=/root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882784.40482: variable 'ansible_module_compression' from source: unknown 28173 1726882784.40484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28173 1726882784.40486: variable 'ansible_facts' from source: unknown 28173 1726882784.40658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/AnsiballZ_package_facts.py 28173 1726882784.40851: Sending initial data 28173 1726882784.40854: Sent initial data (162 bytes) 28173 
1726882784.42031: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882784.42046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.42061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.42084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.42148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.42160: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882784.42178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.42197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882784.42209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882784.42230: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882784.42248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.42262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.42284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.42298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.42310: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882784.42325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.42412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882784.42438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882784.42467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.42606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.44435: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882784.44523: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882784.44629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmprp4uko6x /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/AnsiballZ_package_facts.py <<< 28173 1726882784.44723: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882784.47143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.47244: stderr chunk (state=3): >>><<< 28173 1726882784.47248: stdout chunk (state=3): >>><<< 28173 
1726882784.47262: done transferring module to remote 28173 1726882784.47275: _low_level_execute_command(): starting 28173 1726882784.47281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/ /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/AnsiballZ_package_facts.py && sleep 0' 28173 1726882784.47727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882784.47733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.47744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.47778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.47785: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28173 1726882784.47788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882784.47797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882784.47802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.47811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.47819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.47828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882784.47830: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882784.47840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.47892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882784.47915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882784.47922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.48021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.49856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.49909: stderr chunk (state=3): >>><<< 28173 1726882784.49911: stdout chunk (state=3): >>><<< 28173 1726882784.49971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882784.49977: _low_level_execute_command(): starting 28173 1726882784.49979: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/AnsiballZ_package_facts.py && sleep 0' 28173 1726882784.50359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882784.50384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882784.50388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.50390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882784.50432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.50435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882784.50437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882784.50439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882784.50493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882784.50496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882784.50605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882784.96804: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null,<<< 28173 1726882784.96831: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", 
"version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 28173 1726882784.96840: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": <<< 28173 1726882784.96880: stdout chunk (state=3): >>>"7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": 
"21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 28173 1726882784.96908: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", <<< 28173 1726882784.96932: stdout chunk (state=3): >>>"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": 
"passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch<<< 28173 1726882784.96946: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source"<<< 28173 1726882784.96950: stdout chunk (state=3): >>>: "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "ar<<< 28173 1726882784.96957: stdout chunk (state=3): >>>ch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", <<< 28173 1726882784.96964: stdout chunk (state=3): >>>"release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "<<< 28173 1726882784.96988: stdout chunk (state=3): >>>version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", 
"version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "re<<< 28173 1726882784.97020: stdout chunk (state=3): >>>lease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", <<< 28173 1726882784.97033: stdout chunk (state=3): >>>"source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], 
"strategy": "first"}}} <<< 28173 1726882784.98477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882784.98504: stderr chunk (state=3): >>>Shared connection to 10.31.11.158 closed. <<< 28173 1726882784.98566: stderr chunk (state=3): >>><<< 28173 1726882784.98573: stdout chunk (state=3): >>><<< 28173 1726882784.98637: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": 
"rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": 
"sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": 
"nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": 
[{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": 
[{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", 
"release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882785.03997: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882785.04044: _low_level_execute_command(): starting 28173 1726882785.04071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882784.3716795-29804-227203688209783/ > /dev/null 2>&1 && sleep 0' 28173 1726882785.04897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882785.04905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882785.04942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882785.04947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882785.04956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882785.04961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882785.04984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882785.04987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882785.05038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882785.05050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882785.05174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882785.07054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882785.07117: stderr chunk (state=3): >>><<< 28173 1726882785.07120: stdout chunk (state=3): >>><<< 28173 1726882785.07380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882785.07383: handler run complete 28173 1726882785.08639: variable 'ansible_facts' from source: unknown 28173 1726882785.09298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.10506: variable 'ansible_facts' from source: unknown 28173 1726882785.10774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.11226: attempt loop complete, returning result 28173 1726882785.11238: _execute() done 28173 1726882785.11241: dumping result to json 28173 1726882785.11393: done dumping result, returning 28173 1726882785.11397: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-926c-8928-0000000007d5] 28173 1726882785.11400: sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d5 28173 1726882785.20405: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000007d5 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882785.20443: WORKER PROCESS EXITING 28173 1726882785.20452: no more pending results, returning what we have 28173 1726882785.20454: results queue empty 28173 1726882785.20455: checking for any_errors_fatal 28173 1726882785.20458: done checking for any_errors_fatal 28173 1726882785.20458: checking for max_fail_percentage 28173 1726882785.20459: done checking for max_fail_percentage 28173 1726882785.20460: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.20460: done checking to see if all hosts have failed 28173 1726882785.20461: getting the remaining hosts for this loop 28173 1726882785.20462: done getting the remaining hosts for this loop 28173 1726882785.20466: getting the next task for host managed_node2 28173 1726882785.20470: done getting next task for host managed_node2 28173 1726882785.20472: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882785.20473: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882785.20478: getting variables 28173 1726882785.20479: in VariableManager get_vars() 28173 1726882785.20494: Calling all_inventory to load vars for managed_node2 28173 1726882785.20496: Calling groups_inventory to load vars for managed_node2 28173 1726882785.20497: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.20501: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.20502: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.20504: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.21240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.23886: done with get_vars() 28173 1726882785.23912: done getting variables 28173 1726882785.23959: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:45 -0400 (0:00:00.956) 0:00:38.404 ****** 28173 1726882785.23988: entering _queue_task() for managed_node2/debug 28173 1726882785.24392: worker is 1 (out of 1 available) 28173 1726882785.24405: exiting _queue_task() for managed_node2/debug 28173 1726882785.24418: done queuing things up, now waiting for results queue to drain 28173 1726882785.24420: waiting for pending results... 28173 1726882785.24799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28173 1726882785.24931: in run() - task 0e448fcc-3ce9-926c-8928-0000000000d8 28173 1726882785.24952: variable 'ansible_search_path' from source: unknown 28173 1726882785.24959: variable 'ansible_search_path' from source: unknown 28173 1726882785.25007: calling self._execute() 28173 1726882785.25128: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.25143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.25158: variable 'omit' from source: magic vars 28173 1726882785.25558: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.25583: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.25596: variable 'omit' from source: magic vars 28173 1726882785.25643: variable 'omit' from source: magic vars 28173 1726882785.25749: variable 'network_provider' from source: set_fact 28173 1726882785.25773: variable 'omit' from source: magic vars 28173 1726882785.25820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882785.25862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882785.25890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882785.25919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882785.25937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 
1726882785.25975: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882785.25982: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.25989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.26096: Set connection var ansible_pipelining to False 28173 1726882785.26105: Set connection var ansible_shell_type to sh 28173 1726882785.26124: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882785.26136: Set connection var ansible_timeout to 10 28173 1726882785.26146: Set connection var ansible_shell_executable to /bin/sh 28173 1726882785.26155: Set connection var ansible_connection to ssh 28173 1726882785.26186: variable 'ansible_shell_executable' from source: unknown 28173 1726882785.26194: variable 'ansible_connection' from source: unknown 28173 1726882785.26200: variable 'ansible_module_compression' from source: unknown 28173 1726882785.26206: variable 'ansible_shell_type' from source: unknown 28173 1726882785.26212: variable 'ansible_shell_executable' from source: unknown 28173 1726882785.26220: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.26230: variable 'ansible_pipelining' from source: unknown 28173 1726882785.26236: variable 'ansible_timeout' from source: unknown 28173 1726882785.26243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.26394: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882785.26413: variable 'omit' from source: magic vars 28173 1726882785.26422: starting attempt loop 28173 1726882785.26459: running the handler 28173 1726882785.26514: handler run complete 28173 1726882785.26585: attempt loop complete, returning result 28173 1726882785.26592: _execute() done 28173 1726882785.26597: dumping result to json 28173 1726882785.26603: done dumping result, returning 28173 1726882785.27301: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-926c-8928-0000000000d8] 28173 1726882785.27314: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d8 ok: [managed_node2] => {} MSG: Using network provider: nm 28173 1726882785.27467: no more pending results, returning what we have 28173 1726882785.27470: results queue empty 28173 1726882785.27471: checking for any_errors_fatal 28173 1726882785.27485: done checking for any_errors_fatal 28173 1726882785.27486: checking for max_fail_percentage 28173 1726882785.27488: done checking for max_fail_percentage 28173 1726882785.27489: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.27490: done checking to see if all hosts have failed 28173 1726882785.27491: getting the remaining hosts for this loop 28173 1726882785.27493: done getting the remaining hosts for this loop 28173 1726882785.27496: getting the next task for host managed_node2 28173 1726882785.27502: done getting next task for host managed_node2 28173 1726882785.27506: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882785.27508: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882785.27520: getting variables 28173 1726882785.27522: in VariableManager get_vars() 28173 1726882785.27559: Calling all_inventory to load vars for managed_node2 28173 1726882785.27562: Calling groups_inventory to load vars for managed_node2 28173 1726882785.27567: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.27578: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.27580: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.27583: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.28605: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d8 28173 1726882785.28609: WORKER PROCESS EXITING 28173 1726882785.30294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.32011: done with get_vars() 28173 1726882785.32034: done getting variables 28173 1726882785.32095: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:45 -0400 (0:00:00.081) 0:00:38.485 ****** 28173 1726882785.32133: entering _queue_task() for managed_node2/fail 28173 1726882785.32446: worker is 1 (out of 1 available) 28173 1726882785.32457: exiting _queue_task() for managed_node2/fail 28173 1726882785.32472: done queuing things up, now waiting for results queue to drain 28173 1726882785.32473: waiting for pending results... 
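Annotation: the debug result above ("Using network provider: nm") comes from the task at roles/network/tasks/main.yml:7. The role source is not part of this log, so the following is only a sketch reconstructed from what the trace shows (the debug action plugin and the network_provider variable set earlier via set_fact); the version guard seen in the trace (ansible_distribution_major_version != '6') may be applied at a block or play level rather than on the task itself.

# Sketch of the task at tasks/main.yml:7, reconstructed from the trace above; not copied from the role source.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"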
28173 1726882785.32874: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28173 1726882785.33000: in run() - task 0e448fcc-3ce9-926c-8928-0000000000d9 28173 1726882785.33021: variable 'ansible_search_path' from source: unknown 28173 1726882785.33030: variable 'ansible_search_path' from source: unknown 28173 1726882785.33076: calling self._execute() 28173 1726882785.33186: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.33197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.33218: variable 'omit' from source: magic vars 28173 1726882785.33984: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.34003: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.34227: variable 'network_state' from source: role '' defaults 28173 1726882785.34310: Evaluated conditional (network_state != {}): False 28173 1726882785.34318: when evaluation is False, skipping this task 28173 1726882785.34324: _execute() done 28173 1726882785.34331: dumping result to json 28173 1726882785.34342: done dumping result, returning 28173 1726882785.34355: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-926c-8928-0000000000d9] 28173 1726882785.34371: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d9 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882785.34560: no more pending results, returning what we have 28173 1726882785.34566: results queue empty 28173 1726882785.34567: checking for any_errors_fatal 28173 1726882785.34579: done checking for any_errors_fatal 28173 1726882785.34580: checking for max_fail_percentage 28173 1726882785.34582: done checking for max_fail_percentage 28173 1726882785.34583: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.34584: done checking to see if all hosts have failed 28173 1726882785.34585: getting the remaining hosts for this loop 28173 1726882785.34587: done getting the remaining hosts for this loop 28173 1726882785.34590: getting the next task for host managed_node2 28173 1726882785.34596: done getting next task for host managed_node2 28173 1726882785.34601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882785.34605: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882785.34620: getting variables 28173 1726882785.34622: in VariableManager get_vars() 28173 1726882785.34662: Calling all_inventory to load vars for managed_node2 28173 1726882785.34666: Calling groups_inventory to load vars for managed_node2 28173 1726882785.34669: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.34685: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.34689: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.34692: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.35720: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000d9 28173 1726882785.35724: WORKER PROCESS EXITING 28173 1726882785.36810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.39062: done with get_vars() 28173 1726882785.39089: done getting variables 28173 1726882785.39151: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:45 -0400 (0:00:00.070) 0:00:38.556 ****** 28173 1726882785.39186: entering _queue_task() for managed_node2/fail 28173 1726882785.39631: worker is 1 (out of 1 available) 28173 1726882785.39686: exiting _queue_task() for managed_node2/fail 28173 1726882785.39698: done queuing things up, now waiting for results queue to drain 28173 1726882785.39699: waiting for pending results... 
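Annotation: the skip at tasks/main.yml:11 reports network_state != {} as the false condition; on this run network_state is an empty dict taken from the role defaults, so the abort never fires. A minimal sketch of such a guarded abort task follows; the second clause and the message are assumptions implied only by the task name, not read from the role.

# Sketch only; the first `when` clause is the false_condition from the trace, the rest is assumed.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider.
  when:
    - network_state != {}
    - network_provider == "initscripts"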
28173 1726882785.41788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28173 1726882785.41904: in run() - task 0e448fcc-3ce9-926c-8928-0000000000da 28173 1726882785.42292: variable 'ansible_search_path' from source: unknown 28173 1726882785.42322: variable 'ansible_search_path' from source: unknown 28173 1726882785.42976: calling self._execute() 28173 1726882785.43367: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.43604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.43637: variable 'omit' from source: magic vars 28173 1726882785.44703: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.44761: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.45498: variable 'network_state' from source: role '' defaults 28173 1726882785.45518: Evaluated conditional (network_state != {}): False 28173 1726882785.45529: when evaluation is False, skipping this task 28173 1726882785.45541: _execute() done 28173 1726882785.45573: dumping result to json 28173 1726882785.45583: done dumping result, returning 28173 1726882785.45594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-926c-8928-0000000000da] 28173 1726882785.45606: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000da skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882785.46070: no more pending results, returning what we have 28173 1726882785.46078: results queue empty 28173 1726882785.46079: checking for any_errors_fatal 28173 1726882785.46088: done checking for any_errors_fatal 28173 1726882785.46089: checking for max_fail_percentage 28173 1726882785.46091: done checking for max_fail_percentage 28173 1726882785.46092: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.46093: done checking to see if all hosts have failed 28173 1726882785.46094: getting the remaining hosts for this loop 28173 1726882785.46095: done getting the remaining hosts for this loop 28173 1726882785.46099: getting the next task for host managed_node2 28173 1726882785.46106: done getting next task for host managed_node2 28173 1726882785.46110: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882785.46113: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882785.46128: getting variables 28173 1726882785.46131: in VariableManager get_vars() 28173 1726882785.46751: Calling all_inventory to load vars for managed_node2 28173 1726882785.46754: Calling groups_inventory to load vars for managed_node2 28173 1726882785.46760: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.46852: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.46856: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.46859: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.47601: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000da 28173 1726882785.47605: WORKER PROCESS EXITING 28173 1726882785.51287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.55699: done with get_vars() 28173 1726882785.55727: done getting variables 28173 1726882785.55800: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:45 -0400 (0:00:00.166) 0:00:38.722 ****** 28173 1726882785.55832: entering _queue_task() for managed_node2/fail 28173 1726882785.56176: worker is 1 (out of 1 available) 28173 1726882785.56188: exiting _queue_task() for managed_node2/fail 28173 1726882785.56206: done queuing things up, now waiting for results queue to drain 28173 1726882785.56208: waiting for pending results... 
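Annotation: the abort at tasks/main.yml:18 is skipped with the same false_condition (network_state != {}) even though its name is about the managed host's system version. That is how multi-clause conditionals are reported: the clauses in a `when:` list are ANDed, and the result records the failing clause, which is why a later version check never shows up in the skip output here. A small illustration with placeholder clauses:

# Placeholder example of false_condition reporting; the clause contents are illustrative only.
- name: Example of a multi-clause when
  ansible.builtin.debug:
    msg: only reached when every clause is true
  when:
    - network_state != {}                             # an empty dict makes this the reported false_condition
    - ansible_distribution_major_version | int < 8    # not the clause reported once the first one fails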
28173 1726882785.56524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28173 1726882785.56659: in run() - task 0e448fcc-3ce9-926c-8928-0000000000db 28173 1726882785.56683: variable 'ansible_search_path' from source: unknown 28173 1726882785.56691: variable 'ansible_search_path' from source: unknown 28173 1726882785.56735: calling self._execute() 28173 1726882785.56983: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.57052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.57074: variable 'omit' from source: magic vars 28173 1726882785.57949: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.57976: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.58348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882785.61712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882785.61792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882785.62077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882785.62116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882785.62199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882785.62358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.62538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.62615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.62756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.62780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.62997: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.63059: Evaluated conditional (ansible_distribution_major_version | int > 9): False 28173 1726882785.63070: when evaluation is False, skipping this task 28173 1726882785.63078: _execute() done 28173 1726882785.63150: dumping result to json 28173 1726882785.63167: done dumping result, returning 28173 1726882785.63182: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-926c-8928-0000000000db] 28173 1726882785.63194: sending task result for task 
0e448fcc-3ce9-926c-8928-0000000000db skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 28173 1726882785.63351: no more pending results, returning what we have 28173 1726882785.63355: results queue empty 28173 1726882785.63355: checking for any_errors_fatal 28173 1726882785.63362: done checking for any_errors_fatal 28173 1726882785.63363: checking for max_fail_percentage 28173 1726882785.63368: done checking for max_fail_percentage 28173 1726882785.63369: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.63370: done checking to see if all hosts have failed 28173 1726882785.63371: getting the remaining hosts for this loop 28173 1726882785.63372: done getting the remaining hosts for this loop 28173 1726882785.63376: getting the next task for host managed_node2 28173 1726882785.63382: done getting next task for host managed_node2 28173 1726882785.63386: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882785.63388: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882785.63400: getting variables 28173 1726882785.63403: in VariableManager get_vars() 28173 1726882785.63445: Calling all_inventory to load vars for managed_node2 28173 1726882785.63448: Calling groups_inventory to load vars for managed_node2 28173 1726882785.63450: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.63461: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.63467: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.63471: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.64525: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000db 28173 1726882785.64528: WORKER PROCESS EXITING 28173 1726882785.66109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.68119: done with get_vars() 28173 1726882785.68140: done getting variables 28173 1726882785.68193: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:45 -0400 (0:00:00.124) 0:00:38.846 ****** 28173 1726882785.68233: entering _queue_task() for managed_node2/dnf 28173 1726882785.68624: worker is 1 (out of 1 available) 28173 1726882785.68638: exiting _queue_task() for managed_node2/dnf 28173 1726882785.68650: done queuing things up, now waiting for results queue to drain 28173 1726882785.68651: waiting for pending results... 
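Annotation: for the teaming abort at tasks/main.yml:25 the trace does surface the version clause itself, ansible_distribution_major_version | int > 9, which evaluates False on this host. A sketch consistent with that; only the `when` expression is taken from the trace, the message is assumed.

# Sketch; the `when` expression matches the conditional evaluated in the trace, the rest is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.
  when: ansible_distribution_major_version | int > 9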
28173 1726882785.68885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28173 1726882785.69009: in run() - task 0e448fcc-3ce9-926c-8928-0000000000dc 28173 1726882785.69030: variable 'ansible_search_path' from source: unknown 28173 1726882785.69039: variable 'ansible_search_path' from source: unknown 28173 1726882785.69087: calling self._execute() 28173 1726882785.69193: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.69204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.69225: variable 'omit' from source: magic vars 28173 1726882785.69704: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.69714: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.69853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882785.73573: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882785.73650: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882785.73829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882785.73870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882785.73930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882785.74342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.74401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.74587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.74699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.74782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.75153: variable 'ansible_distribution' from source: facts 28173 1726882785.75156: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.75191: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28173 1726882785.75343: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882785.75524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.75553: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.75590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.75645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.75659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.75720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.75752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.75791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.75844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.75870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.75920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.75958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.75992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.76041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.76074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.76287: variable 'network_connections' from source: play vars 28173 1726882785.76303: variable 'profile' from source: play vars 28173 1726882785.76388: variable 'profile' from source: play vars 28173 1726882785.76406: variable 'interface' from source: set_fact 28173 1726882785.76479: variable 'interface' from source: set_fact 28173 1726882785.76571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 28173 1726882785.76777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882785.76829: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882785.76873: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882785.76912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882785.76973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882785.77034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882785.77086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.77143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882785.77225: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882785.77820: variable 'network_connections' from source: play vars 28173 1726882785.77830: variable 'profile' from source: play vars 28173 1726882785.77905: variable 'profile' from source: play vars 28173 1726882785.77921: variable 'interface' from source: set_fact 28173 1726882785.78272: variable 'interface' from source: set_fact 28173 1726882785.78310: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882785.78323: when evaluation is False, skipping this task 28173 1726882785.78331: _execute() done 28173 1726882785.78339: dumping result to json 28173 1726882785.78346: done dumping result, returning 28173 1726882785.78358: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000dc] 28173 1726882785.78375: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000dc skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882785.78541: no more pending results, returning what we have 28173 1726882785.78544: results queue empty 28173 1726882785.78545: checking for any_errors_fatal 28173 1726882785.78552: done checking for any_errors_fatal 28173 1726882785.78553: checking for max_fail_percentage 28173 1726882785.78554: done checking for max_fail_percentage 28173 1726882785.78555: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.78556: done checking to see if all hosts have failed 28173 1726882785.78556: getting the remaining hosts for this loop 28173 1726882785.78558: done getting the remaining hosts for this loop 28173 1726882785.78561: getting the next task for host managed_node2 28173 1726882785.78569: done getting next task for host managed_node2 28173 
1726882785.78573: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882785.78575: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882785.78591: getting variables 28173 1726882785.78593: in VariableManager get_vars() 28173 1726882785.78633: Calling all_inventory to load vars for managed_node2 28173 1726882785.78635: Calling groups_inventory to load vars for managed_node2 28173 1726882785.78637: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.78649: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.78652: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.78656: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.79179: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000dc 28173 1726882785.79183: WORKER PROCESS EXITING 28173 1726882785.81244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.86265: done with get_vars() 28173 1726882785.86295: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28173 1726882785.86497: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:45 -0400 (0:00:00.182) 0:00:39.029 ****** 28173 1726882785.86525: entering _queue_task() for managed_node2/yum 28173 1726882785.87835: worker is 1 (out of 1 available) 28173 1726882785.87846: exiting _queue_task() for managed_node2/yum 28173 1726882785.87858: done queuing things up, now waiting for results queue to drain 28173 1726882785.87859: waiting for pending results... 
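Annotation: both package-manager checks are gated on the connection list actually defining wireless or team interfaces, which is why the DNF variant at tasks/main.yml:36 is skipped here (__network_wireless_connections_defined or __network_team_connections_defined evaluates False for this plain interface profile). A sketch consistent with the two conditionals shown in the trace; the dnf arguments are placeholders, since the module is never reached on this run.

# Sketch; the two `when` clauses are the conditionals evaluated in the trace, the dnf arguments are placeholders.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  check_mode: true
  ansible.builtin.dnf:
    name: NetworkManager   # placeholder package name
    state: latest
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined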
28173 1726882785.88359: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28173 1726882785.88537: in run() - task 0e448fcc-3ce9-926c-8928-0000000000dd 28173 1726882785.88559: variable 'ansible_search_path' from source: unknown 28173 1726882785.88573: variable 'ansible_search_path' from source: unknown 28173 1726882785.88655: calling self._execute() 28173 1726882785.88876: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.88887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.88900: variable 'omit' from source: magic vars 28173 1726882785.89309: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.89327: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882785.89534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882785.92134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882785.92209: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882785.92260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882785.92304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882785.92491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882785.92731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882785.92826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882785.92988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882785.93046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882785.93099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882785.93342: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.93365: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28173 1726882785.93378: when evaluation is False, skipping this task 28173 1726882785.93395: _execute() done 28173 1726882785.93402: dumping result to json 28173 1726882785.93409: done dumping result, returning 28173 1726882785.93419: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000dd] 28173 
1726882785.93430: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000dd skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28173 1726882785.93620: no more pending results, returning what we have 28173 1726882785.93624: results queue empty 28173 1726882785.93625: checking for any_errors_fatal 28173 1726882785.93634: done checking for any_errors_fatal 28173 1726882785.93635: checking for max_fail_percentage 28173 1726882785.93637: done checking for max_fail_percentage 28173 1726882785.93638: checking to see if all hosts have failed and the running result is not ok 28173 1726882785.93639: done checking to see if all hosts have failed 28173 1726882785.93639: getting the remaining hosts for this loop 28173 1726882785.93644: done getting the remaining hosts for this loop 28173 1726882785.93652: getting the next task for host managed_node2 28173 1726882785.93662: done getting next task for host managed_node2 28173 1726882785.93671: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882785.93677: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882785.93693: getting variables 28173 1726882785.93695: in VariableManager get_vars() 28173 1726882785.93749: Calling all_inventory to load vars for managed_node2 28173 1726882785.93752: Calling groups_inventory to load vars for managed_node2 28173 1726882785.93754: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882785.93769: Calling all_plugins_play to load vars for managed_node2 28173 1726882785.93773: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882785.93776: Calling groups_plugins_play to load vars for managed_node2 28173 1726882785.94859: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000dd 28173 1726882785.94867: WORKER PROCESS EXITING 28173 1726882785.96249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882785.98272: done with get_vars() 28173 1726882785.98320: done getting variables 28173 1726882785.98416: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:45 -0400 (0:00:00.119) 0:00:39.148 ****** 28173 1726882785.98446: entering _queue_task() for managed_node2/fail 28173 1726882785.98776: worker is 1 (out of 1 available) 28173 1726882785.98790: exiting _queue_task() for managed_node2/fail 28173 1726882785.98801: done queuing things up, now waiting for results queue to drain 28173 1726882785.98802: waiting for pending results... 
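Annotation: the YUM variant at tasks/main.yml:48 mirrors the DNF sketch above but is gated on ansible_distribution_major_version | int < 8, so it only applies to EL7-era hosts and is skipped here. The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" message earlier in the trace is expected on ansible-core 2.17, where the yum action resolves to the dnf plugin. Only the gate is worth sketching; the remaining details are assumed identical to the DNF example.

# Sketch of the gate that differs from the DNF variant; other details assumed identical to the sketch above.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:     # resolves to the dnf action plugin on ansible-core 2.17, per the redirect line in the trace
    name: NetworkManager   # placeholder package name
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined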
28173 1726882785.99116: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28173 1726882785.99238: in run() - task 0e448fcc-3ce9-926c-8928-0000000000de 28173 1726882785.99268: variable 'ansible_search_path' from source: unknown 28173 1726882785.99283: variable 'ansible_search_path' from source: unknown 28173 1726882785.99324: calling self._execute() 28173 1726882785.99435: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882785.99447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882785.99464: variable 'omit' from source: magic vars 28173 1726882785.99948: variable 'ansible_distribution_major_version' from source: facts 28173 1726882785.99972: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.00171: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.00417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882786.03777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882786.03860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882786.03994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882786.04093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882786.04173: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882786.04256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.04307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.04338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.04425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.04520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.04649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.04814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.04887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.05011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.05085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.05230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.05291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.05329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.05378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.05402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.05574: variable 'network_connections' from source: play vars 28173 1726882786.05593: variable 'profile' from source: play vars 28173 1726882786.05674: variable 'profile' from source: play vars 28173 1726882786.05690: variable 'interface' from source: set_fact 28173 1726882786.06734: variable 'interface' from source: set_fact 28173 1726882786.06836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882786.07020: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882786.07066: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882786.07107: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882786.07147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882786.07286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882786.07318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882786.07388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.07497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882786.07648: 
variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882786.08256: variable 'network_connections' from source: play vars 28173 1726882786.08260: variable 'profile' from source: play vars 28173 1726882786.08371: variable 'profile' from source: play vars 28173 1726882786.08390: variable 'interface' from source: set_fact 28173 1726882786.08471: variable 'interface' from source: set_fact 28173 1726882786.08506: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882786.08536: when evaluation is False, skipping this task 28173 1726882786.08543: _execute() done 28173 1726882786.08550: dumping result to json 28173 1726882786.08562: done dumping result, returning 28173 1726882786.08580: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000de] 28173 1726882786.08597: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000de skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882786.08764: no more pending results, returning what we have 28173 1726882786.08771: results queue empty 28173 1726882786.08772: checking for any_errors_fatal 28173 1726882786.08779: done checking for any_errors_fatal 28173 1726882786.08780: checking for max_fail_percentage 28173 1726882786.08781: done checking for max_fail_percentage 28173 1726882786.08782: checking to see if all hosts have failed and the running result is not ok 28173 1726882786.08783: done checking to see if all hosts have failed 28173 1726882786.08784: getting the remaining hosts for this loop 28173 1726882786.08786: done getting the remaining hosts for this loop 28173 1726882786.08789: getting the next task for host managed_node2 28173 1726882786.08796: done getting next task for host managed_node2 28173 1726882786.08800: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28173 1726882786.08803: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882786.08815: getting variables 28173 1726882786.08818: in VariableManager get_vars() 28173 1726882786.08858: Calling all_inventory to load vars for managed_node2 28173 1726882786.08861: Calling groups_inventory to load vars for managed_node2 28173 1726882786.08865: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882786.08884: Calling all_plugins_play to load vars for managed_node2 28173 1726882786.08888: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882786.08891: Calling groups_plugins_play to load vars for managed_node2 28173 1726882786.09936: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000de 28173 1726882786.09939: WORKER PROCESS EXITING 28173 1726882786.10883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882786.14522: done with get_vars() 28173 1726882786.14547: done getting variables 28173 1726882786.14621: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:46 -0400 (0:00:00.162) 0:00:39.310 ****** 28173 1726882786.14655: entering _queue_task() for managed_node2/package 28173 1726882786.14987: worker is 1 (out of 1 available) 28173 1726882786.15000: exiting _queue_task() for managed_node2/package 28173 1726882786.15017: done queuing things up, now waiting for results queue to drain 28173 1726882786.15018: waiting for pending results... 
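Annotation: the Install packages task at tasks/main.yml:73 uses the generic package action, and the variable lookups that follow show network_packages being assembled from the provider defaults (__network_packages_default_nm, the gobject packages, and the wpa_supplicant handling tied to __network_wpa_supplicant_required and the 802.1x check). A minimal sketch, assuming nothing beyond the action plugin and the variable name visible in the trace:

# Minimal sketch of the task at tasks/main.yml:73; only the package action and the
# network_packages variable are taken from the trace, the state is assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present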
28173 1726882786.15339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28173 1726882786.15482: in run() - task 0e448fcc-3ce9-926c-8928-0000000000df 28173 1726882786.15502: variable 'ansible_search_path' from source: unknown 28173 1726882786.15509: variable 'ansible_search_path' from source: unknown 28173 1726882786.15549: calling self._execute() 28173 1726882786.15662: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.15688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.15711: variable 'omit' from source: magic vars 28173 1726882786.16154: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.16185: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.16414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882786.16725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882786.16784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882786.16828: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882786.16915: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882786.17051: variable 'network_packages' from source: role '' defaults 28173 1726882786.17172: variable '__network_provider_setup' from source: role '' defaults 28173 1726882786.17186: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882786.17260: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882786.17276: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882786.17344: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882786.17537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882786.20972: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882786.21049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882786.21095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882786.21138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882786.21174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882786.21252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.21291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.21322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.21381: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.21402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.21449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.21513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.21543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.21601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.21621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.21928: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882786.22062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.22097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.22138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.22187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.22207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.22321: variable 'ansible_python' from source: facts 28173 1726882786.22361: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882786.22573: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882786.22869: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882786.24105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.24139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 28173 1726882786.24178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.24223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.24242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.24303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.24339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.24383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.24428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.24447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.24616: variable 'network_connections' from source: play vars 28173 1726882786.24627: variable 'profile' from source: play vars 28173 1726882786.24746: variable 'profile' from source: play vars 28173 1726882786.24759: variable 'interface' from source: set_fact 28173 1726882786.24844: variable 'interface' from source: set_fact 28173 1726882786.24949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882786.24986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882786.25027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.25087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882786.25146: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.25502: variable 'network_connections' from source: play vars 28173 1726882786.25512: variable 'profile' from source: play vars 28173 1726882786.25624: variable 'profile' from source: play vars 28173 1726882786.25637: variable 'interface' from source: set_fact 28173 1726882786.25720: variable 'interface' from source: set_fact 28173 1726882786.25756: variable 
'__network_packages_default_wireless' from source: role '' defaults 28173 1726882786.25849: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.26199: variable 'network_connections' from source: play vars 28173 1726882786.26211: variable 'profile' from source: play vars 28173 1726882786.26286: variable 'profile' from source: play vars 28173 1726882786.26294: variable 'interface' from source: set_fact 28173 1726882786.26407: variable 'interface' from source: set_fact 28173 1726882786.26441: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882786.26536: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882786.26879: variable 'network_connections' from source: play vars 28173 1726882786.26889: variable 'profile' from source: play vars 28173 1726882786.26953: variable 'profile' from source: play vars 28173 1726882786.26962: variable 'interface' from source: set_fact 28173 1726882786.27084: variable 'interface' from source: set_fact 28173 1726882786.27149: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882786.27224: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882786.27235: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882786.27303: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882786.27543: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882786.28077: variable 'network_connections' from source: play vars 28173 1726882786.28087: variable 'profile' from source: play vars 28173 1726882786.28151: variable 'profile' from source: play vars 28173 1726882786.28163: variable 'interface' from source: set_fact 28173 1726882786.28236: variable 'interface' from source: set_fact 28173 1726882786.28250: variable 'ansible_distribution' from source: facts 28173 1726882786.28257: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.28273: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.28298: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882786.28475: variable 'ansible_distribution' from source: facts 28173 1726882786.28484: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.28500: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.28518: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882786.28692: variable 'ansible_distribution' from source: facts 28173 1726882786.28700: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.28711: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.28754: variable 'network_provider' from source: set_fact 28173 1726882786.28776: variable 'ansible_facts' from source: unknown 28173 1726882786.29595: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28173 1726882786.29602: when evaluation is False, skipping this task 28173 1726882786.29608: _execute() done 28173 1726882786.29614: dumping result to json 28173 1726882786.29621: done dumping result, returning 28173 1726882786.29632: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0e448fcc-3ce9-926c-8928-0000000000df] 28173 1726882786.29641: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000df skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28173 1726882786.29797: no more pending results, returning what we have 28173 1726882786.29801: results queue empty 28173 1726882786.29802: checking for any_errors_fatal 28173 1726882786.29810: done checking for any_errors_fatal 28173 1726882786.29811: checking for max_fail_percentage 28173 1726882786.29813: done checking for max_fail_percentage 28173 1726882786.29814: checking to see if all hosts have failed and the running result is not ok 28173 1726882786.29815: done checking to see if all hosts have failed 28173 1726882786.29816: getting the remaining hosts for this loop 28173 1726882786.29817: done getting the remaining hosts for this loop 28173 1726882786.29821: getting the next task for host managed_node2 28173 1726882786.29827: done getting next task for host managed_node2 28173 1726882786.29832: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882786.29835: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882786.29848: getting variables 28173 1726882786.29850: in VariableManager get_vars() 28173 1726882786.29893: Calling all_inventory to load vars for managed_node2 28173 1726882786.29896: Calling groups_inventory to load vars for managed_node2 28173 1726882786.29899: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882786.29910: Calling all_plugins_play to load vars for managed_node2 28173 1726882786.29919: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882786.29923: Calling groups_plugins_play to load vars for managed_node2 28173 1726882786.30906: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000df 28173 1726882786.30909: WORKER PROCESS EXITING 28173 1726882786.31942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882786.35438: done with get_vars() 28173 1726882786.35468: done getting variables 28173 1726882786.35536: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:46 -0400 (0:00:00.209) 0:00:39.520 ****** 28173 1726882786.35569: entering _queue_task() for managed_node2/package 28173 1726882786.36386: worker is 1 (out of 1 available) 28173 1726882786.36399: exiting _queue_task() for managed_node2/package 28173 1726882786.36411: done queuing things up, now waiting for results queue to drain 28173 1726882786.36412: waiting for pending results... 
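For readers reconstructing what produced the skip above, here is a minimal sketch of a play that exercises the same conditional. Only the task name, the package action, and the when expression (copied verbatim from the false_condition in the result) come from this log; the hosts pattern, the package_facts step, and the example network_packages value are assumptions, not the fedora.linux_system_roles.network source.

- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Gather package facts so ansible_facts.packages is populated
      ansible.builtin.package_facts:
        manager: auto

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      vars:
        network_packages:          # assumed example value for illustration
          - NetworkManager
      when: not network_packages is subset(ansible_facts.packages.keys())

In the run above every listed package is already present in ansible_facts.packages, so the subset test succeeds, the negated condition evaluates to False, and the package action module is never dispatched.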
28173 1726882786.37176: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28173 1726882786.37294: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e0 28173 1726882786.37307: variable 'ansible_search_path' from source: unknown 28173 1726882786.37311: variable 'ansible_search_path' from source: unknown 28173 1726882786.37349: calling self._execute() 28173 1726882786.37446: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.37456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.37470: variable 'omit' from source: magic vars 28173 1726882786.37904: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.37918: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.38048: variable 'network_state' from source: role '' defaults 28173 1726882786.38057: Evaluated conditional (network_state != {}): False 28173 1726882786.38062: when evaluation is False, skipping this task 28173 1726882786.38064: _execute() done 28173 1726882786.38069: dumping result to json 28173 1726882786.38074: done dumping result, returning 28173 1726882786.38077: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-926c-8928-0000000000e0] 28173 1726882786.38085: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e0 28173 1726882786.38304: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e0 28173 1726882786.38311: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882786.38365: no more pending results, returning what we have 28173 1726882786.38376: results queue empty 28173 1726882786.38377: checking for any_errors_fatal 28173 1726882786.38384: done checking for any_errors_fatal 28173 1726882786.38385: checking for max_fail_percentage 28173 1726882786.38387: done checking for max_fail_percentage 28173 1726882786.38388: checking to see if all hosts have failed and the running result is not ok 28173 1726882786.38389: done checking to see if all hosts have failed 28173 1726882786.38390: getting the remaining hosts for this loop 28173 1726882786.38391: done getting the remaining hosts for this loop 28173 1726882786.38396: getting the next task for host managed_node2 28173 1726882786.38402: done getting next task for host managed_node2 28173 1726882786.38407: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882786.38410: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882786.38425: getting variables 28173 1726882786.38427: in VariableManager get_vars() 28173 1726882786.38470: Calling all_inventory to load vars for managed_node2 28173 1726882786.38473: Calling groups_inventory to load vars for managed_node2 28173 1726882786.38476: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882786.38491: Calling all_plugins_play to load vars for managed_node2 28173 1726882786.38495: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882786.38498: Calling groups_plugins_play to load vars for managed_node2 28173 1726882786.41288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882786.45033: done with get_vars() 28173 1726882786.45061: done getting variables 28173 1726882786.45134: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:46 -0400 (0:00:00.095) 0:00:39.616 ****** 28173 1726882786.45168: entering _queue_task() for managed_node2/package 28173 1726882786.45512: worker is 1 (out of 1 available) 28173 1726882786.45533: exiting _queue_task() for managed_node2/package 28173 1726882786.45701: done queuing things up, now waiting for results queue to drain 28173 1726882786.45703: waiting for pending results... 
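Both install tasks gated on network_state (main.yml:85, just skipped, and main.yml:96, queued next) follow the same pattern; a hypothetical rendering is below. The task names and the network_state != {} condition are taken from the log, while the package lists and state: present are assumptions.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager            # assumed package list
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate      # assumed, inferred from the task name
    state: present
  when: network_state != {}

network_state keeps its empty role default in this play, so both conditions evaluate to False and both tasks are skipped before any action plugin runs.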
28173 1726882786.45875: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28173 1726882786.46048: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e1 28173 1726882786.46072: variable 'ansible_search_path' from source: unknown 28173 1726882786.46080: variable 'ansible_search_path' from source: unknown 28173 1726882786.46337: calling self._execute() 28173 1726882786.46439: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.46448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.46460: variable 'omit' from source: magic vars 28173 1726882786.46969: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.47089: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.47311: variable 'network_state' from source: role '' defaults 28173 1726882786.47423: Evaluated conditional (network_state != {}): False 28173 1726882786.47431: when evaluation is False, skipping this task 28173 1726882786.47438: _execute() done 28173 1726882786.47445: dumping result to json 28173 1726882786.47451: done dumping result, returning 28173 1726882786.47460: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-926c-8928-0000000000e1] 28173 1726882786.47475: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e1 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882786.47619: no more pending results, returning what we have 28173 1726882786.47622: results queue empty 28173 1726882786.47623: checking for any_errors_fatal 28173 1726882786.47630: done checking for any_errors_fatal 28173 1726882786.47631: checking for max_fail_percentage 28173 1726882786.47633: done checking for max_fail_percentage 28173 1726882786.47634: checking to see if all hosts have failed and the running result is not ok 28173 1726882786.47634: done checking to see if all hosts have failed 28173 1726882786.47635: getting the remaining hosts for this loop 28173 1726882786.47636: done getting the remaining hosts for this loop 28173 1726882786.47639: getting the next task for host managed_node2 28173 1726882786.47645: done getting next task for host managed_node2 28173 1726882786.47649: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882786.47653: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882786.47672: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e1 28173 1726882786.47676: WORKER PROCESS EXITING 28173 1726882786.47685: getting variables 28173 1726882786.47687: in VariableManager get_vars() 28173 1726882786.47726: Calling all_inventory to load vars for managed_node2 28173 1726882786.47729: Calling groups_inventory to load vars for managed_node2 28173 1726882786.47731: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882786.47744: Calling all_plugins_play to load vars for managed_node2 28173 1726882786.47747: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882786.47750: Calling groups_plugins_play to load vars for managed_node2 28173 1726882786.49988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882786.52630: done with get_vars() 28173 1726882786.52654: done getting variables 28173 1726882786.52721: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:46 -0400 (0:00:00.075) 0:00:39.691 ****** 28173 1726882786.52752: entering _queue_task() for managed_node2/service 28173 1726882786.53047: worker is 1 (out of 1 available) 28173 1726882786.53058: exiting _queue_task() for managed_node2/service 28173 1726882786.53072: done queuing things up, now waiting for results queue to drain 28173 1726882786.53073: waiting for pending results... 
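The task queued here (main.yml:109) uses the service action plugin and, as the conditional evaluation a few records below shows, only fires when wireless or team connections are defined. A rough sketch, with the service arguments assumed rather than taken from the role:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager          # assumed; the role derives the name from its defaults
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

Neither kind of connection appears in network_connections for this profile, so the condition comes out False and the restart is skipped.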
28173 1726882786.53333: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28173 1726882786.53493: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e2 28173 1726882786.53526: variable 'ansible_search_path' from source: unknown 28173 1726882786.53539: variable 'ansible_search_path' from source: unknown 28173 1726882786.53638: calling self._execute() 28173 1726882786.54097: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.54165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.54229: variable 'omit' from source: magic vars 28173 1726882786.54936: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.54959: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.55355: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.56273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882786.59627: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882786.59709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882786.59744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882786.59780: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882786.59815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882786.59890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.59938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.59961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.60003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.60021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.60061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.60084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.60115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28173 1726882786.60163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.60179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.60223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.60250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.60277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.60315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.60334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.60865: variable 'network_connections' from source: play vars 28173 1726882786.60877: variable 'profile' from source: play vars 28173 1726882786.61039: variable 'profile' from source: play vars 28173 1726882786.61043: variable 'interface' from source: set_fact 28173 1726882786.61139: variable 'interface' from source: set_fact 28173 1726882786.61209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882786.61654: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882786.61750: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882786.61834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882786.61896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882786.61957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882786.61980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882786.62009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.62041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882786.62103: variable '__network_team_connections_defined' from source: role '' defaults 28173 
1726882786.62611: variable 'network_connections' from source: play vars 28173 1726882786.62616: variable 'profile' from source: play vars 28173 1726882786.62695: variable 'profile' from source: play vars 28173 1726882786.62702: variable 'interface' from source: set_fact 28173 1726882786.62837: variable 'interface' from source: set_fact 28173 1726882786.62861: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28173 1726882786.62868: when evaluation is False, skipping this task 28173 1726882786.62871: _execute() done 28173 1726882786.62873: dumping result to json 28173 1726882786.62875: done dumping result, returning 28173 1726882786.62883: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-926c-8928-0000000000e2] 28173 1726882786.62896: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e2 28173 1726882786.63007: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e2 28173 1726882786.63010: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28173 1726882786.63058: no more pending results, returning what we have 28173 1726882786.63063: results queue empty 28173 1726882786.63065: checking for any_errors_fatal 28173 1726882786.63072: done checking for any_errors_fatal 28173 1726882786.63073: checking for max_fail_percentage 28173 1726882786.63075: done checking for max_fail_percentage 28173 1726882786.63076: checking to see if all hosts have failed and the running result is not ok 28173 1726882786.63077: done checking to see if all hosts have failed 28173 1726882786.63078: getting the remaining hosts for this loop 28173 1726882786.63080: done getting the remaining hosts for this loop 28173 1726882786.63084: getting the next task for host managed_node2 28173 1726882786.63090: done getting next task for host managed_node2 28173 1726882786.63094: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882786.63097: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882786.63111: getting variables 28173 1726882786.63113: in VariableManager get_vars() 28173 1726882786.63153: Calling all_inventory to load vars for managed_node2 28173 1726882786.63156: Calling groups_inventory to load vars for managed_node2 28173 1726882786.63159: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882786.63172: Calling all_plugins_play to load vars for managed_node2 28173 1726882786.63176: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882786.63179: Calling groups_plugins_play to load vars for managed_node2 28173 1726882786.65419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882786.69354: done with get_vars() 28173 1726882786.69775: done getting variables 28173 1726882786.69953: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:46 -0400 (0:00:00.173) 0:00:39.865 ****** 28173 1726882786.70070: entering _queue_task() for managed_node2/service 28173 1726882786.70456: worker is 1 (out of 1 available) 28173 1726882786.70472: exiting _queue_task() for managed_node2/service 28173 1726882786.70490: done queuing things up, now waiting for results queue to drain 28173 1726882786.70491: waiting for pending results... 28173 1726882786.70959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28173 1726882786.71232: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e3 28173 1726882786.71341: variable 'ansible_search_path' from source: unknown 28173 1726882786.71345: variable 'ansible_search_path' from source: unknown 28173 1726882786.71383: calling self._execute() 28173 1726882786.71562: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.71587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.71607: variable 'omit' from source: magic vars 28173 1726882786.72015: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.72032: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882786.72160: variable 'network_provider' from source: set_fact 28173 1726882786.72166: variable 'network_state' from source: role '' defaults 28173 1726882786.72177: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28173 1726882786.72183: variable 'omit' from source: magic vars 28173 1726882786.72213: variable 'omit' from source: magic vars 28173 1726882786.72235: variable 'network_service_name' from source: role '' defaults 28173 1726882786.72290: variable 'network_service_name' from source: role '' defaults 28173 1726882786.72397: variable '__network_provider_setup' from source: role '' defaults 28173 1726882786.72401: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882786.72466: variable '__network_service_name_default_nm' from source: role '' defaults 28173 1726882786.72475: variable '__network_packages_default_nm' from source: role '' defaults 
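Unlike the preceding tasks, this one runs: network_state is still empty, so the True result can only come from network_provider == "nm". An assumed shape of the task follows, with the condition and the service action taken from the log and the argument values guessed:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # NetworkManager on the nm provider (assumed default)
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

The records that follow show the consequence of the condition holding: the ssh connection and sh shell plugins are selected, a remote temp directory is created under /root/.ansible/tmp, and the systemd module is shipped as AnsiballZ_systemd.py before being executed on the managed node.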
28173 1726882786.72531: variable '__network_packages_default_nm' from source: role '' defaults 28173 1726882786.72708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882786.75616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882786.75659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882786.75713: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882786.75767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882786.75791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882786.75867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.75901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.75923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.75951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.75962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.76004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.76022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.76040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.76068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.76080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.76243: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28173 1726882786.76369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.76404: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.76434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.76461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.76477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.76608: variable 'ansible_python' from source: facts 28173 1726882786.76611: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28173 1726882786.76677: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882786.76749: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882786.76868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.76886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.76909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.76933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.76942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.76977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882786.76996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882786.77017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.77055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882786.77067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882786.77158: variable 'network_connections' from 
source: play vars 28173 1726882786.77165: variable 'profile' from source: play vars 28173 1726882786.77218: variable 'profile' from source: play vars 28173 1726882786.77221: variable 'interface' from source: set_fact 28173 1726882786.77297: variable 'interface' from source: set_fact 28173 1726882786.77399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882786.77594: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882786.77636: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882786.77671: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882786.77749: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882786.77834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882786.77896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882786.77944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882786.78001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882786.78076: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.78379: variable 'network_connections' from source: play vars 28173 1726882786.78389: variable 'profile' from source: play vars 28173 1726882786.78471: variable 'profile' from source: play vars 28173 1726882786.78487: variable 'interface' from source: set_fact 28173 1726882786.78560: variable 'interface' from source: set_fact 28173 1726882786.78599: variable '__network_packages_default_wireless' from source: role '' defaults 28173 1726882786.78699: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882786.79028: variable 'network_connections' from source: play vars 28173 1726882786.79038: variable 'profile' from source: play vars 28173 1726882786.79121: variable 'profile' from source: play vars 28173 1726882786.79130: variable 'interface' from source: set_fact 28173 1726882786.79220: variable 'interface' from source: set_fact 28173 1726882786.79255: variable '__network_packages_default_team' from source: role '' defaults 28173 1726882786.79342: variable '__network_team_connections_defined' from source: role '' defaults 28173 1726882786.79649: variable 'network_connections' from source: play vars 28173 1726882786.79658: variable 'profile' from source: play vars 28173 1726882786.79755: variable 'profile' from source: play vars 28173 1726882786.79769: variable 'interface' from source: set_fact 28173 1726882786.79859: variable 'interface' from source: set_fact 28173 1726882786.79926: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882786.79996: variable '__network_service_name_default_initscripts' from source: role '' defaults 28173 1726882786.80007: 
variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882786.80079: variable '__network_packages_default_initscripts' from source: role '' defaults 28173 1726882786.80310: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28173 1726882786.80722: variable 'network_connections' from source: play vars 28173 1726882786.80726: variable 'profile' from source: play vars 28173 1726882786.80781: variable 'profile' from source: play vars 28173 1726882786.80790: variable 'interface' from source: set_fact 28173 1726882786.80838: variable 'interface' from source: set_fact 28173 1726882786.80861: variable 'ansible_distribution' from source: facts 28173 1726882786.80868: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.80871: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.80873: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28173 1726882786.81019: variable 'ansible_distribution' from source: facts 28173 1726882786.81027: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.81035: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.81050: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28173 1726882786.81266: variable 'ansible_distribution' from source: facts 28173 1726882786.81272: variable '__network_rh_distros' from source: role '' defaults 28173 1726882786.81274: variable 'ansible_distribution_major_version' from source: facts 28173 1726882786.81293: variable 'network_provider' from source: set_fact 28173 1726882786.81301: variable 'omit' from source: magic vars 28173 1726882786.81327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882786.81361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882786.81380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882786.81397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882786.81407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882786.81436: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882786.81448: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.81453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.81560: Set connection var ansible_pipelining to False 28173 1726882786.81568: Set connection var ansible_shell_type to sh 28173 1726882786.81574: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882786.81582: Set connection var ansible_timeout to 10 28173 1726882786.81588: Set connection var ansible_shell_executable to /bin/sh 28173 1726882786.81592: Set connection var ansible_connection to ssh 28173 1726882786.81617: variable 'ansible_shell_executable' from source: unknown 28173 1726882786.81620: variable 'ansible_connection' from source: unknown 28173 1726882786.81623: variable 'ansible_module_compression' from source: unknown 28173 1726882786.81626: variable 'ansible_shell_type' from source: unknown 28173 1726882786.81628: variable 'ansible_shell_executable' from 
source: unknown 28173 1726882786.81630: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882786.81635: variable 'ansible_pipelining' from source: unknown 28173 1726882786.81638: variable 'ansible_timeout' from source: unknown 28173 1726882786.81640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882786.81743: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882786.81752: variable 'omit' from source: magic vars 28173 1726882786.81757: starting attempt loop 28173 1726882786.81760: running the handler 28173 1726882786.81841: variable 'ansible_facts' from source: unknown 28173 1726882786.82624: _low_level_execute_command(): starting 28173 1726882786.82628: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882786.83109: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.83123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.83140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882786.83151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.83210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882786.83216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882786.83331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882786.84986: stdout chunk (state=3): >>>/root <<< 28173 1726882786.85147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882786.85157: stdout chunk (state=3): >>><<< 28173 1726882786.85171: stderr chunk (state=3): >>><<< 28173 1726882786.85194: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882786.85210: _low_level_execute_command(): starting 28173 1726882786.85227: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802 `" && echo ansible-tmp-1726882786.8519964-29923-217293947849802="` echo /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802 `" ) && sleep 0' 28173 1726882786.85846: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882786.85860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882786.85883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.85900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.85939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882786.85954: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882786.85970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.85995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882786.86007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882786.86018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882786.86030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882786.86042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.86056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.86072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882786.86085: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882786.86106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.86197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882786.86203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882786.86312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882786.88222: stdout chunk (state=3): >>>ansible-tmp-1726882786.8519964-29923-217293947849802=/root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802 <<< 28173 1726882786.88333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882786.88383: stderr chunk (state=3): >>><<< 28173 1726882786.88387: stdout chunk (state=3): >>><<< 28173 1726882786.88399: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882786.8519964-29923-217293947849802=/root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882786.88423: variable 'ansible_module_compression' from source: unknown 28173 1726882786.88462: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28173 1726882786.88517: variable 'ansible_facts' from source: unknown 28173 1726882786.88653: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/AnsiballZ_systemd.py 28173 1726882786.88759: Sending initial data 28173 1726882786.88772: Sent initial data (156 bytes) 28173 1726882786.89396: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882786.89403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.89434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.89456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.89459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882786.89461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.89511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882786.89516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882786.89633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882786.91386: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28173 1726882786.91393: stderr chunk (state=3): 
>>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28173 1726882786.91399: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28173 1726882786.91404: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 28173 1726882786.91409: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 28173 1726882786.91416: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882786.91524: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882786.91624: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmp_g3c896k /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/AnsiballZ_systemd.py <<< 28173 1726882786.91719: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882786.93728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882786.93816: stderr chunk (state=3): >>><<< 28173 1726882786.93819: stdout chunk (state=3): >>><<< 28173 1726882786.93835: done transferring module to remote 28173 1726882786.93844: _low_level_execute_command(): starting 28173 1726882786.93847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/ /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/AnsiballZ_systemd.py && sleep 0' 28173 1726882786.94276: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.94284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.94313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.94318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882786.94326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882786.94332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882786.94340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.94348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882786.94353: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882786.94358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.94420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882786.94423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882786.94433: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882786.94540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882786.96344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882786.96391: stderr chunk (state=3): >>><<< 28173 1726882786.96394: stdout chunk (state=3): >>><<< 28173 1726882786.96406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882786.96409: _low_level_execute_command(): starting 28173 1726882786.96411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/AnsiballZ_systemd.py && sleep 0' 28173 1726882786.96816: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882786.96821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882786.96852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882786.96874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882786.96885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882786.96924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882786.96936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882786.97047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.21971: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": 
"main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 28173 1726882787.21990: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2019396000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 28173 1726882787.21995: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": 
"enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28173 1726882787.23583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882787.23605: stderr chunk (state=3): >>><<< 28173 1726882787.23608: stdout chunk (state=3): >>><<< 28173 1726882787.23891: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2019396000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
28173 1726882787.23900: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882787.23903: _low_level_execute_command(): starting 28173 1726882787.23905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882786.8519964-29923-217293947849802/ > /dev/null 2>&1 && sleep 0' 28173 1726882787.24468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882787.24482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.24498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.24516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.24558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.24578: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882787.24592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.24608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882787.24619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882787.24629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882787.24640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.24651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.24670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.24684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.24694: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882787.24705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.24782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.24798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882787.24811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.24954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.26785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882787.26856: stderr chunk (state=3): >>><<< 28173 1726882787.26862: stdout chunk (state=3): >>><<< 28173 1726882787.26884: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882787.26891: handler run complete 28173 1726882787.26951: attempt loop complete, returning result 28173 1726882787.26954: _execute() done 28173 1726882787.26956: dumping result to json 28173 1726882787.26978: done dumping result, returning 28173 1726882787.26987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-926c-8928-0000000000e3] 28173 1726882787.26993: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e3 28173 1726882787.27238: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e3 28173 1726882787.27242: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882787.27301: no more pending results, returning what we have 28173 1726882787.27305: results queue empty 28173 1726882787.27305: checking for any_errors_fatal 28173 1726882787.27313: done checking for any_errors_fatal 28173 1726882787.27314: checking for max_fail_percentage 28173 1726882787.27316: done checking for max_fail_percentage 28173 1726882787.27317: checking to see if all hosts have failed and the running result is not ok 28173 1726882787.27318: done checking to see if all hosts have failed 28173 1726882787.27319: getting the remaining hosts for this loop 28173 1726882787.27320: done getting the remaining hosts for this loop 28173 1726882787.27324: getting the next task for host managed_node2 28173 1726882787.27330: done getting next task for host managed_node2 28173 1726882787.27334: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882787.27336: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882787.27346: getting variables 28173 1726882787.27348: in VariableManager get_vars() 28173 1726882787.27393: Calling all_inventory to load vars for managed_node2 28173 1726882787.27395: Calling groups_inventory to load vars for managed_node2 28173 1726882787.27398: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882787.27409: Calling all_plugins_play to load vars for managed_node2 28173 1726882787.27412: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882787.27416: Calling groups_plugins_play to load vars for managed_node2 28173 1726882787.29242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882787.31148: done with get_vars() 28173 1726882787.31185: done getting variables 28173 1726882787.31247: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:47 -0400 (0:00:00.612) 0:00:40.477 ****** 28173 1726882787.31296: entering _queue_task() for managed_node2/service 28173 1726882787.31615: worker is 1 (out of 1 available) 28173 1726882787.31627: exiting _queue_task() for managed_node2/service 28173 1726882787.31641: done queuing things up, now waiting for results queue to drain 28173 1726882787.31642: waiting for pending results... 28173 1726882787.31957: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28173 1726882787.32080: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e4 28173 1726882787.32095: variable 'ansible_search_path' from source: unknown 28173 1726882787.32099: variable 'ansible_search_path' from source: unknown 28173 1726882787.32138: calling self._execute() 28173 1726882787.32249: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.32259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.32276: variable 'omit' from source: magic vars 28173 1726882787.32696: variable 'ansible_distribution_major_version' from source: facts 28173 1726882787.32709: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882787.32837: variable 'network_provider' from source: set_fact 28173 1726882787.32840: Evaluated conditional (network_provider == "nm"): True 28173 1726882787.32949: variable '__network_wpa_supplicant_required' from source: role '' defaults 28173 1726882787.33051: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28173 1726882787.33257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882787.35340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882787.35387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882787.35415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882787.35440: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882787.35460: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882787.35522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882787.35541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882787.35560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882787.35592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882787.35603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882787.35635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882787.35651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882787.35674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882787.35701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882787.35711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882787.35741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882787.35758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882787.35778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882787.35805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882787.35813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28173 1726882787.35906: variable 'network_connections' from source: play vars 28173 1726882787.35914: variable 'profile' from source: play vars 28173 1726882787.35989: variable 'profile' from source: play vars 28173 1726882787.35992: variable 'interface' from source: set_fact 28173 1726882787.36034: variable 'interface' from source: set_fact 28173 1726882787.36115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28173 1726882787.36297: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28173 1726882787.36301: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28173 1726882787.36347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28173 1726882787.36358: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28173 1726882787.36412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28173 1726882787.36424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28173 1726882787.36448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882787.36478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28173 1726882787.36525: variable '__network_wireless_connections_defined' from source: role '' defaults 28173 1726882787.36754: variable 'network_connections' from source: play vars 28173 1726882787.36757: variable 'profile' from source: play vars 28173 1726882787.36819: variable 'profile' from source: play vars 28173 1726882787.36822: variable 'interface' from source: set_fact 28173 1726882787.36883: variable 'interface' from source: set_fact 28173 1726882787.36909: Evaluated conditional (__network_wpa_supplicant_required): False 28173 1726882787.36912: when evaluation is False, skipping this task 28173 1726882787.36915: _execute() done 28173 1726882787.36926: dumping result to json 28173 1726882787.36928: done dumping result, returning 28173 1726882787.36931: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-926c-8928-0000000000e4] 28173 1726882787.36933: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e4 28173 1726882787.37020: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e4 28173 1726882787.37023: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28173 1726882787.37077: no more pending results, returning what we have 28173 1726882787.37080: results queue empty 28173 1726882787.37081: checking for any_errors_fatal 28173 1726882787.37095: done checking for any_errors_fatal 28173 1726882787.37096: checking for max_fail_percentage 28173 1726882787.37097: done checking for max_fail_percentage 28173 
1726882787.37098: checking to see if all hosts have failed and the running result is not ok 28173 1726882787.37099: done checking to see if all hosts have failed 28173 1726882787.37100: getting the remaining hosts for this loop 28173 1726882787.37101: done getting the remaining hosts for this loop 28173 1726882787.37105: getting the next task for host managed_node2 28173 1726882787.37110: done getting next task for host managed_node2 28173 1726882787.37114: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882787.37116: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882787.37128: getting variables 28173 1726882787.37130: in VariableManager get_vars() 28173 1726882787.37167: Calling all_inventory to load vars for managed_node2 28173 1726882787.37170: Calling groups_inventory to load vars for managed_node2 28173 1726882787.37172: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882787.37182: Calling all_plugins_play to load vars for managed_node2 28173 1726882787.37185: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882787.37188: Calling groups_plugins_play to load vars for managed_node2 28173 1726882787.38303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882787.39274: done with get_vars() 28173 1726882787.39290: done getting variables 28173 1726882787.39333: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:47 -0400 (0:00:00.080) 0:00:40.557 ****** 28173 1726882787.39357: entering _queue_task() for managed_node2/service 28173 1726882787.39599: worker is 1 (out of 1 available) 28173 1726882787.39612: exiting _queue_task() for managed_node2/service 28173 1726882787.39623: done queuing things up, now waiting for results queue to drain 28173 1726882787.39624: waiting for pending results... 
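
Note on the skip a few entries above: the task "Enable and start wpa_supplicant" evaluated its conditionals in order: ansible_distribution_major_version != '6' (True), network_provider == "nm" (True), then __network_wpa_supplicant_required (False), so the task was skipped. The sketch below restates that gate in plain Python. Variable names are taken from the log; how the role actually derives __network_wpa_supplicant_required from the *_connections_defined flags is an assumption here, and the example values in the call are illustrative, not the real host facts.

    # Sketch of the skip decision reported in the log for this task.
    def wpa_supplicant_task_should_run(facts, role_vars):
        if facts["ansible_distribution_major_version"] == "6":
            return False
        if role_vars["network_provider"] != "nm":
            return False
        # In this run both "defined" flags were false, so
        # __network_wpa_supplicant_required evaluated to False and the task
        # was skipped with skip_reason "Conditional result was False".
        return (role_vars["__network_ieee802_1x_connections_defined"]
                or role_vars["__network_wireless_connections_defined"])

    # Example values only; just the comparisons mirror the log.
    print(wpa_supplicant_task_should_run(
        {"ansible_distribution_major_version": "9"},
        {"network_provider": "nm",
         "__network_ieee802_1x_connections_defined": False,
         "__network_wireless_connections_defined": False},
    ))  # -> False, matching the "skipping: [managed_node2]" result above
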
28173 1726882787.39912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28173 1726882787.40006: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e5 28173 1726882787.40017: variable 'ansible_search_path' from source: unknown 28173 1726882787.40021: variable 'ansible_search_path' from source: unknown 28173 1726882787.40057: calling self._execute() 28173 1726882787.40161: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.40168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.40186: variable 'omit' from source: magic vars 28173 1726882787.40565: variable 'ansible_distribution_major_version' from source: facts 28173 1726882787.40580: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882787.40696: variable 'network_provider' from source: set_fact 28173 1726882787.40701: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882787.40704: when evaluation is False, skipping this task 28173 1726882787.40707: _execute() done 28173 1726882787.40709: dumping result to json 28173 1726882787.40712: done dumping result, returning 28173 1726882787.40724: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-926c-8928-0000000000e5] 28173 1726882787.40732: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e5 28173 1726882787.40821: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e5 28173 1726882787.40824: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28173 1726882787.40873: no more pending results, returning what we have 28173 1726882787.40877: results queue empty 28173 1726882787.40878: checking for any_errors_fatal 28173 1726882787.40888: done checking for any_errors_fatal 28173 1726882787.40889: checking for max_fail_percentage 28173 1726882787.40890: done checking for max_fail_percentage 28173 1726882787.40892: checking to see if all hosts have failed and the running result is not ok 28173 1726882787.40892: done checking to see if all hosts have failed 28173 1726882787.40893: getting the remaining hosts for this loop 28173 1726882787.40895: done getting the remaining hosts for this loop 28173 1726882787.40898: getting the next task for host managed_node2 28173 1726882787.40904: done getting next task for host managed_node2 28173 1726882787.40908: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882787.40911: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882787.40926: getting variables 28173 1726882787.40928: in VariableManager get_vars() 28173 1726882787.40971: Calling all_inventory to load vars for managed_node2 28173 1726882787.40974: Calling groups_inventory to load vars for managed_node2 28173 1726882787.40977: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882787.40990: Calling all_plugins_play to load vars for managed_node2 28173 1726882787.40994: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882787.40997: Calling groups_plugins_play to load vars for managed_node2 28173 1726882787.42424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882787.47716: done with get_vars() 28173 1726882787.47741: done getting variables 28173 1726882787.47797: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:47 -0400 (0:00:00.084) 0:00:40.642 ****** 28173 1726882787.47823: entering _queue_task() for managed_node2/copy 28173 1726882787.48141: worker is 1 (out of 1 available) 28173 1726882787.48154: exiting _queue_task() for managed_node2/copy 28173 1726882787.48170: done queuing things up, now waiting for results queue to drain 28173 1726882787.48171: waiting for pending results... 28173 1726882787.48382: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28173 1726882787.48459: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e6 28173 1726882787.48476: variable 'ansible_search_path' from source: unknown 28173 1726882787.48481: variable 'ansible_search_path' from source: unknown 28173 1726882787.48515: calling self._execute() 28173 1726882787.48594: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.48597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.48608: variable 'omit' from source: magic vars 28173 1726882787.48895: variable 'ansible_distribution_major_version' from source: facts 28173 1726882787.48906: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882787.48989: variable 'network_provider' from source: set_fact 28173 1726882787.48993: Evaluated conditional (network_provider == "initscripts"): False 28173 1726882787.48996: when evaluation is False, skipping this task 28173 1726882787.48999: _execute() done 28173 1726882787.49002: dumping result to json 28173 1726882787.49004: done dumping result, returning 28173 1726882787.49013: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-926c-8928-0000000000e6] 28173 1726882787.49016: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e6 28173 1726882787.49113: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e6 28173 1726882787.49116: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 28173 1726882787.49170: no more pending results, returning what we have 28173 1726882787.49174: results queue empty 28173 1726882787.49174: checking for any_errors_fatal 28173 1726882787.49182: done checking for any_errors_fatal 28173 1726882787.49183: checking for max_fail_percentage 28173 1726882787.49185: done checking for max_fail_percentage 28173 1726882787.49186: checking to see if all hosts have failed and the running result is not ok 28173 1726882787.49187: done checking to see if all hosts have failed 28173 1726882787.49187: getting the remaining hosts for this loop 28173 1726882787.49189: done getting the remaining hosts for this loop 28173 1726882787.49192: getting the next task for host managed_node2 28173 1726882787.49196: done getting next task for host managed_node2 28173 1726882787.49200: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882787.49202: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882787.49215: getting variables 28173 1726882787.49217: in VariableManager get_vars() 28173 1726882787.49257: Calling all_inventory to load vars for managed_node2 28173 1726882787.49260: Calling groups_inventory to load vars for managed_node2 28173 1726882787.49263: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882787.49273: Calling all_plugins_play to load vars for managed_node2 28173 1726882787.49276: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882787.49279: Calling groups_plugins_play to load vars for managed_node2 28173 1726882787.50242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882787.51609: done with get_vars() 28173 1726882787.51624: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:47 -0400 (0:00:00.038) 0:00:40.681 ****** 28173 1726882787.51682: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882787.51871: worker is 1 (out of 1 available) 28173 1726882787.51885: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28173 1726882787.51896: done queuing things up, now waiting for results queue to drain 28173 1726882787.51897: waiting for pending results... 
28173 1726882787.52081: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28173 1726882787.52156: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e7 28173 1726882787.52172: variable 'ansible_search_path' from source: unknown 28173 1726882787.52176: variable 'ansible_search_path' from source: unknown 28173 1726882787.52204: calling self._execute() 28173 1726882787.52282: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.52286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.52297: variable 'omit' from source: magic vars 28173 1726882787.52572: variable 'ansible_distribution_major_version' from source: facts 28173 1726882787.52580: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882787.52586: variable 'omit' from source: magic vars 28173 1726882787.52615: variable 'omit' from source: magic vars 28173 1726882787.52727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28173 1726882787.54456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28173 1726882787.54503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28173 1726882787.54538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28173 1726882787.54565: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28173 1726882787.54588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28173 1726882787.54640: variable 'network_provider' from source: set_fact 28173 1726882787.54730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28173 1726882787.54750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28173 1726882787.54771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28173 1726882787.54797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28173 1726882787.54814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28173 1726882787.54860: variable 'omit' from source: magic vars 28173 1726882787.54937: variable 'omit' from source: magic vars 28173 1726882787.55006: variable 'network_connections' from source: play vars 28173 1726882787.55014: variable 'profile' from source: play vars 28173 1726882787.55066: variable 'profile' from source: play vars 28173 1726882787.55072: variable 'interface' from source: set_fact 28173 1726882787.55113: variable 'interface' from source: set_fact 28173 1726882787.55214: variable 'omit' from source: magic vars 28173 1726882787.55220: 
variable '__lsr_ansible_managed' from source: task vars 28173 1726882787.55269: variable '__lsr_ansible_managed' from source: task vars 28173 1726882787.55516: Loaded config def from plugin (lookup/template) 28173 1726882787.55520: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28173 1726882787.55523: File lookup term: get_ansible_managed.j2 28173 1726882787.55526: variable 'ansible_search_path' from source: unknown 28173 1726882787.55529: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28173 1726882787.55540: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28173 1726882787.55552: variable 'ansible_search_path' from source: unknown 28173 1726882787.60451: variable 'ansible_managed' from source: unknown 28173 1726882787.60537: variable 'omit' from source: magic vars 28173 1726882787.60558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882787.60585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882787.60598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882787.60612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882787.60620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882787.60641: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882787.60645: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.60648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.60717: Set connection var ansible_pipelining to False 28173 1726882787.60720: Set connection var ansible_shell_type to sh 28173 1726882787.60726: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882787.60733: Set connection var ansible_timeout to 10 28173 1726882787.60738: Set connection var ansible_shell_executable to /bin/sh 28173 1726882787.60742: Set connection var ansible_connection to ssh 28173 1726882787.60758: variable 'ansible_shell_executable' from source: unknown 28173 1726882787.60761: variable 'ansible_connection' from source: unknown 28173 1726882787.60765: 
variable 'ansible_module_compression' from source: unknown 28173 1726882787.60768: variable 'ansible_shell_type' from source: unknown 28173 1726882787.60771: variable 'ansible_shell_executable' from source: unknown 28173 1726882787.60775: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882787.60779: variable 'ansible_pipelining' from source: unknown 28173 1726882787.60781: variable 'ansible_timeout' from source: unknown 28173 1726882787.60788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882787.60876: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882787.60887: variable 'omit' from source: magic vars 28173 1726882787.60890: starting attempt loop 28173 1726882787.60894: running the handler 28173 1726882787.60912: _low_level_execute_command(): starting 28173 1726882787.60915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882787.61396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.61411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.61424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.61435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 28173 1726882787.61451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.61494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.61505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.61620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.63469: stdout chunk (state=3): >>>/root <<< 28173 1726882787.63733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882787.63808: stderr chunk (state=3): >>><<< 28173 1726882787.63814: stdout chunk (state=3): >>><<< 28173 1726882787.63839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882787.63851: _low_level_execute_command(): starting 28173 1726882787.63857: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906 `" && echo ansible-tmp-1726882787.6383939-29949-211952265202906="` echo /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906 `" ) && sleep 0' 28173 1726882787.64461: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882787.64476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.64485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.64498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.64537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.64544: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882787.64554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.64574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882787.64582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882787.64588: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882787.64596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.64605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.64616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.64623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.64631: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882787.64640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.64713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.64731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882787.64738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.65380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.66890: stdout chunk (state=3): >>>ansible-tmp-1726882787.6383939-29949-211952265202906=/root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906 <<< 28173 1726882787.67074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 
1726882787.67078: stdout chunk (state=3): >>><<< 28173 1726882787.67086: stderr chunk (state=3): >>><<< 28173 1726882787.67105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882787.6383939-29949-211952265202906=/root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882787.67151: variable 'ansible_module_compression' from source: unknown 28173 1726882787.67205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28173 1726882787.67237: variable 'ansible_facts' from source: unknown 28173 1726882787.67328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/AnsiballZ_network_connections.py 28173 1726882787.67493: Sending initial data 28173 1726882787.67496: Sent initial data (168 bytes) 28173 1726882787.68436: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882787.68445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.68455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.68475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.68507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.68515: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882787.68526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.68543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882787.68548: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882787.68551: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882787.68558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.68566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.68587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.68594: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.68601: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882787.68612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.68696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.68700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882787.68709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.69074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.70843: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882787.70940: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882787.71033: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpdyjdb0q5 /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/AnsiballZ_network_connections.py <<< 28173 1726882787.71129: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882787.73402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882787.73477: stderr chunk (state=3): >>><<< 28173 1726882787.73481: stdout chunk (state=3): >>><<< 28173 1726882787.73502: done transferring module to remote 28173 1726882787.73513: _low_level_execute_command(): starting 28173 1726882787.73518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/ /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/AnsiballZ_network_connections.py && sleep 0' 28173 1726882787.74981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.74988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.75144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 
1726882787.75262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.75319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882787.75322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.75452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882787.77282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882787.77350: stderr chunk (state=3): >>><<< 28173 1726882787.77353: stdout chunk (state=3): >>><<< 28173 1726882787.77444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882787.77449: _low_level_execute_command(): starting 28173 1726882787.77452: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/AnsiballZ_network_connections.py && sleep 0' 28173 1726882787.78026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882787.78046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.78061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.78083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882787.78126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.78137: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882787.78158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.78187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882787.78198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882787.78208: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882787.78224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882787.78236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882787.78250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 
1726882787.78261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882787.78278: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882787.78291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882787.78377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882787.78393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882787.78413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882787.78686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.02817: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7teyxv00/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7teyxv00/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 28173 1726882788.02821: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/9ea3c671-64f7-45cb-8f46-d70830299b65: error=unknown <<< 28173 1726882788.02987: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28173 1726882788.04607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882788.04610: stdout chunk (state=3): >>><<< 28173 1726882788.04615: stderr chunk (state=3): >>><<< 28173 1726882788.04755: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7teyxv00/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7teyxv00/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/9ea3c671-64f7-45cb-8f46-d70830299b65: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
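The module_args echoed in the result above show what the role asked fedora.linux_system_roles.network_connections to do: remove the persistent profile ethtest0 using the NetworkManager provider (provider: nm, persistent_state: absent). A hedged sketch of a play that requests the same outcome through the role's public variables follows; the real play builds network_connections from its profile and interface variables (see the variable trace earlier in this task), so the literal values here are taken from the result JSON rather than copied from the test playbook.

- hosts: managed_node2
  vars:
    network_provider: nm               # matches the provider reported in module_args
    network_connections:
      - name: ethtest0                 # profile name from the result above
        persistent_state: absent       # ask the role to delete the persistent profile
  roles:
    - fedora.linux_system_roles.network

Note that the Python traceback printed before the result JSON arrives on the module's stdout; Ansible extracts the JSON payload and ignores surrounding non-JSON lines, which is consistent with the task being reported as changed below rather than failed.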
28173 1726882788.04759: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882788.04762: _low_level_execute_command(): starting 28173 1726882788.04767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882787.6383939-29949-211952265202906/ > /dev/null 2>&1 && sleep 0' 28173 1726882788.05433: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.05446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.05460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.05486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.05525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.05537: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.05551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.05577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.05590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.05602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.05621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.05635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.05651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.05668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.05681: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.05700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.05775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.05985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.05988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.06096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.07914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.07988: stderr chunk (state=3): 
>>><<< 28173 1726882788.07991: stdout chunk (state=3): >>><<< 28173 1726882788.08274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882788.08278: handler run complete 28173 1726882788.08280: attempt loop complete, returning result 28173 1726882788.08283: _execute() done 28173 1726882788.08284: dumping result to json 28173 1726882788.08286: done dumping result, returning 28173 1726882788.08291: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-926c-8928-0000000000e7] 28173 1726882788.08294: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e7 28173 1726882788.08362: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e7 28173 1726882788.08371: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28173 1726882788.08480: no more pending results, returning what we have 28173 1726882788.08483: results queue empty 28173 1726882788.08484: checking for any_errors_fatal 28173 1726882788.08489: done checking for any_errors_fatal 28173 1726882788.08490: checking for max_fail_percentage 28173 1726882788.08491: done checking for max_fail_percentage 28173 1726882788.08492: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.08493: done checking to see if all hosts have failed 28173 1726882788.08493: getting the remaining hosts for this loop 28173 1726882788.08495: done getting the remaining hosts for this loop 28173 1726882788.08498: getting the next task for host managed_node2 28173 1726882788.08503: done getting next task for host managed_node2 28173 1726882788.08507: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882788.08509: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882788.08518: getting variables 28173 1726882788.08519: in VariableManager get_vars() 28173 1726882788.08553: Calling all_inventory to load vars for managed_node2 28173 1726882788.08555: Calling groups_inventory to load vars for managed_node2 28173 1726882788.08558: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.08570: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.08573: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.08576: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.10801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.12680: done with get_vars() 28173 1726882788.12706: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:48 -0400 (0:00:00.611) 0:00:41.292 ****** 28173 1726882788.12797: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882788.13143: worker is 1 (out of 1 available) 28173 1726882788.13154: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28173 1726882788.13173: done queuing things up, now waiting for results queue to drain 28173 1726882788.13174: waiting for pending results... 28173 1726882788.13458: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28173 1726882788.13586: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e8 28173 1726882788.13611: variable 'ansible_search_path' from source: unknown 28173 1726882788.13620: variable 'ansible_search_path' from source: unknown 28173 1726882788.13658: calling self._execute() 28173 1726882788.13774: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.13786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.13800: variable 'omit' from source: magic vars 28173 1726882788.14200: variable 'ansible_distribution_major_version' from source: facts 28173 1726882788.14218: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882788.14342: variable 'network_state' from source: role '' defaults 28173 1726882788.14357: Evaluated conditional (network_state != {}): False 28173 1726882788.14374: when evaluation is False, skipping this task 28173 1726882788.14384: _execute() done 28173 1726882788.14390: dumping result to json 28173 1726882788.14396: done dumping result, returning 28173 1726882788.14405: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-926c-8928-0000000000e8] 28173 1726882788.14415: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e8 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28173 1726882788.14576: no more pending results, returning what we have 28173 1726882788.14580: results queue empty 28173 1726882788.14581: checking for any_errors_fatal 28173 1726882788.14592: done checking for any_errors_fatal 28173 1726882788.14593: checking for max_fail_percentage 28173 1726882788.14595: done checking for max_fail_percentage 28173 1726882788.14596: checking to see if all hosts have failed and the running result is 
not ok 28173 1726882788.14596: done checking to see if all hosts have failed 28173 1726882788.14597: getting the remaining hosts for this loop 28173 1726882788.14599: done getting the remaining hosts for this loop 28173 1726882788.14602: getting the next task for host managed_node2 28173 1726882788.14608: done getting next task for host managed_node2 28173 1726882788.14612: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882788.14615: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882788.14633: getting variables 28173 1726882788.14635: in VariableManager get_vars() 28173 1726882788.14683: Calling all_inventory to load vars for managed_node2 28173 1726882788.14686: Calling groups_inventory to load vars for managed_node2 28173 1726882788.14689: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.14702: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.14705: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.14708: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.15730: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e8 28173 1726882788.15733: WORKER PROCESS EXITING 28173 1726882788.16599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.18590: done with get_vars() 28173 1726882788.18614: done getting variables 28173 1726882788.18683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:48 -0400 (0:00:00.059) 0:00:41.351 ****** 28173 1726882788.18716: entering _queue_task() for managed_node2/debug 28173 1726882788.19059: worker is 1 (out of 1 available) 28173 1726882788.19075: exiting _queue_task() for managed_node2/debug 28173 1726882788.19088: done queuing things up, now waiting for results queue to drain 28173 1726882788.19089: waiting for pending results... 
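The next two tasks only surface what the earlier module call produced: the role stores the module result as __network_connections_result (via set_fact, per the variable trace) and then prints its stderr_lines with a debug task. Below is a self-contained sketch of that inspect pattern; the set_fact stand-in reuses the values actually reported in this run (changed: true, stderr: "\n") and exists only so the snippet runs without the collection installed.

- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Stand-in for the role's internal result fact (values copied from this run)
      ansible.builtin.set_fact:
        __network_connections_result:
          changed: true
          failed: false
          stderr: "\n"
          stderr_lines: [""]

    - name: Show stderr messages for the network_connections (sketch)
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

The output of the real task, a few lines further down, is the same single empty string, because the module's stderr contained only a newline.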
28173 1726882788.19390: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28173 1726882788.19510: in run() - task 0e448fcc-3ce9-926c-8928-0000000000e9 28173 1726882788.19537: variable 'ansible_search_path' from source: unknown 28173 1726882788.19549: variable 'ansible_search_path' from source: unknown 28173 1726882788.19596: calling self._execute() 28173 1726882788.19710: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.19727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.19750: variable 'omit' from source: magic vars 28173 1726882788.20162: variable 'ansible_distribution_major_version' from source: facts 28173 1726882788.20189: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882788.20200: variable 'omit' from source: magic vars 28173 1726882788.20243: variable 'omit' from source: magic vars 28173 1726882788.20291: variable 'omit' from source: magic vars 28173 1726882788.20391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882788.20519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882788.20550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882788.20577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.20594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.20734: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882788.20742: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.20754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.20979: Set connection var ansible_pipelining to False 28173 1726882788.20987: Set connection var ansible_shell_type to sh 28173 1726882788.20999: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882788.21010: Set connection var ansible_timeout to 10 28173 1726882788.21018: Set connection var ansible_shell_executable to /bin/sh 28173 1726882788.21025: Set connection var ansible_connection to ssh 28173 1726882788.21085: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.21093: variable 'ansible_connection' from source: unknown 28173 1726882788.21100: variable 'ansible_module_compression' from source: unknown 28173 1726882788.21106: variable 'ansible_shell_type' from source: unknown 28173 1726882788.21111: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.21117: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.21124: variable 'ansible_pipelining' from source: unknown 28173 1726882788.21163: variable 'ansible_timeout' from source: unknown 28173 1726882788.21176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.21437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882788.21501: variable 'omit' from source: magic vars 28173 1726882788.21516: starting attempt loop 28173 1726882788.21522: running the handler 28173 1726882788.21827: variable '__network_connections_result' from source: set_fact 28173 1726882788.21951: handler run complete 28173 1726882788.21976: attempt loop complete, returning result 28173 1726882788.22016: _execute() done 28173 1726882788.22025: dumping result to json 28173 1726882788.22036: done dumping result, returning 28173 1726882788.22048: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-926c-8928-0000000000e9] 28173 1726882788.22065: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e9 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 28173 1726882788.22247: no more pending results, returning what we have 28173 1726882788.22250: results queue empty 28173 1726882788.22251: checking for any_errors_fatal 28173 1726882788.22260: done checking for any_errors_fatal 28173 1726882788.22261: checking for max_fail_percentage 28173 1726882788.22277: done checking for max_fail_percentage 28173 1726882788.22279: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.22280: done checking to see if all hosts have failed 28173 1726882788.22281: getting the remaining hosts for this loop 28173 1726882788.22282: done getting the remaining hosts for this loop 28173 1726882788.22287: getting the next task for host managed_node2 28173 1726882788.22293: done getting next task for host managed_node2 28173 1726882788.22297: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882788.22299: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882788.22311: getting variables 28173 1726882788.22313: in VariableManager get_vars() 28173 1726882788.22352: Calling all_inventory to load vars for managed_node2 28173 1726882788.22354: Calling groups_inventory to load vars for managed_node2 28173 1726882788.22357: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.22371: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.22375: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.22378: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.23404: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000e9 28173 1726882788.23408: WORKER PROCESS EXITING 28173 1726882788.25523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.29556: done with get_vars() 28173 1726882788.29596: done getting variables 28173 1726882788.29760: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:48 -0400 (0:00:00.110) 0:00:41.462 ****** 28173 1726882788.29799: entering _queue_task() for managed_node2/debug 28173 1726882788.31064: worker is 1 (out of 1 available) 28173 1726882788.31079: exiting _queue_task() for managed_node2/debug 28173 1726882788.31092: done queuing things up, now waiting for results queue to drain 28173 1726882788.31094: waiting for pending results... 
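The upcoming debug task dumps the whole __network_connections_result structure, including the _invocation.module_args block that was already visible in the raw module output. When the goal is validation rather than display, the same fields can feed an assert; the task below is purely illustrative, not part of the role, and it again uses a stand-in fact built from the values printed in this run.

- hosts: managed_node2
  gather_facts: false
  vars:
    __network_connections_result:      # stand-in mirroring the result shown below
      changed: true
      failed: false
      stderr: "\n"
      stderr_lines: [""]
  tasks:
    - name: Check the connection-profile removal result (illustrative only)
      ansible.builtin.assert:
        that:
          - __network_connections_result is changed
          - not __network_connections_result.failed
          - __network_connections_result.stderr_lines == [""]
        fail_msg: unexpected network_connections result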
28173 1726882788.31985: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28173 1726882788.32105: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ea 28173 1726882788.32129: variable 'ansible_search_path' from source: unknown 28173 1726882788.32139: variable 'ansible_search_path' from source: unknown 28173 1726882788.32188: calling self._execute() 28173 1726882788.32305: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.32324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.32342: variable 'omit' from source: magic vars 28173 1726882788.32738: variable 'ansible_distribution_major_version' from source: facts 28173 1726882788.32884: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882788.32896: variable 'omit' from source: magic vars 28173 1726882788.32942: variable 'omit' from source: magic vars 28173 1726882788.33013: variable 'omit' from source: magic vars 28173 1726882788.33115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882788.33222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882788.33320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882788.33341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.33357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.33391: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882788.33433: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.33440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.33655: Set connection var ansible_pipelining to False 28173 1726882788.33724: Set connection var ansible_shell_type to sh 28173 1726882788.33746: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882788.33758: Set connection var ansible_timeout to 10 28173 1726882788.33768: Set connection var ansible_shell_executable to /bin/sh 28173 1726882788.33777: Set connection var ansible_connection to ssh 28173 1726882788.33801: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.33838: variable 'ansible_connection' from source: unknown 28173 1726882788.33850: variable 'ansible_module_compression' from source: unknown 28173 1726882788.33963: variable 'ansible_shell_type' from source: unknown 28173 1726882788.33973: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.33980: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.33987: variable 'ansible_pipelining' from source: unknown 28173 1726882788.33993: variable 'ansible_timeout' from source: unknown 28173 1726882788.34000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.34142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 
1726882788.34296: variable 'omit' from source: magic vars 28173 1726882788.34305: starting attempt loop 28173 1726882788.34311: running the handler 28173 1726882788.34361: variable '__network_connections_result' from source: set_fact 28173 1726882788.34462: variable '__network_connections_result' from source: set_fact 28173 1726882788.34704: handler run complete 28173 1726882788.34768: attempt loop complete, returning result 28173 1726882788.34828: _execute() done 28173 1726882788.34833: dumping result to json 28173 1726882788.34840: done dumping result, returning 28173 1726882788.34849: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-926c-8928-0000000000ea] 28173 1726882788.34885: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ea ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28173 1726882788.35077: no more pending results, returning what we have 28173 1726882788.35082: results queue empty 28173 1726882788.35083: checking for any_errors_fatal 28173 1726882788.35089: done checking for any_errors_fatal 28173 1726882788.35089: checking for max_fail_percentage 28173 1726882788.35091: done checking for max_fail_percentage 28173 1726882788.35092: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.35093: done checking to see if all hosts have failed 28173 1726882788.35094: getting the remaining hosts for this loop 28173 1726882788.35095: done getting the remaining hosts for this loop 28173 1726882788.35099: getting the next task for host managed_node2 28173 1726882788.35106: done getting next task for host managed_node2 28173 1726882788.35110: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882788.35112: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882788.35122: getting variables 28173 1726882788.35124: in VariableManager get_vars() 28173 1726882788.35168: Calling all_inventory to load vars for managed_node2 28173 1726882788.35171: Calling groups_inventory to load vars for managed_node2 28173 1726882788.35174: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.35185: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.35189: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.35192: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.36320: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ea 28173 1726882788.36324: WORKER PROCESS EXITING 28173 1726882788.38045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.40148: done with get_vars() 28173 1726882788.40183: done getting variables 28173 1726882788.40244: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:48 -0400 (0:00:00.104) 0:00:41.567 ****** 28173 1726882788.40282: entering _queue_task() for managed_node2/debug 28173 1726882788.41282: worker is 1 (out of 1 available) 28173 1726882788.41409: exiting _queue_task() for managed_node2/debug 28173 1726882788.41421: done queuing things up, now waiting for results queue to drain 28173 1726882788.41423: waiting for pending results... 
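Both state-handling tasks in this run ("Configure networking state" earlier and "Show debug messages for the network_state" next) skip for the same reason: network_state comes from the role defaults and compares equal to {}, so the network_state != {} gate is False. A hedged sketch of how a caller would take the state-based path instead is shown below; the interface entry is a hypothetical nmstate-style value whose only job is to make the dictionary non-empty, and the ping at the end mirrors the kind of connectivity re-test the role performs at tasks/main.yml:192.

- hosts: managed_node2
  vars:
    network_state:                 # non-empty, so "network_state != {}" would evaluate True
      interfaces:
        - name: ethtest0           # hypothetical interface entry
          type: ethernet
          state: up
  roles:
    - fedora.linux_system_roles.network
  post_tasks:
    - name: Re-test connectivity (sketch of the role's final ping check)
      ansible.builtin.ping: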
28173 1726882788.42423: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28173 1726882788.42774: in run() - task 0e448fcc-3ce9-926c-8928-0000000000eb 28173 1726882788.42889: variable 'ansible_search_path' from source: unknown 28173 1726882788.42940: variable 'ansible_search_path' from source: unknown 28173 1726882788.42984: calling self._execute() 28173 1726882788.43174: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.43220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.43253: variable 'omit' from source: magic vars 28173 1726882788.43650: variable 'ansible_distribution_major_version' from source: facts 28173 1726882788.44049: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882788.44171: variable 'network_state' from source: role '' defaults 28173 1726882788.44185: Evaluated conditional (network_state != {}): False 28173 1726882788.44192: when evaluation is False, skipping this task 28173 1726882788.44198: _execute() done 28173 1726882788.44204: dumping result to json 28173 1726882788.44210: done dumping result, returning 28173 1726882788.44221: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-926c-8928-0000000000eb] 28173 1726882788.44233: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000eb skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28173 1726882788.44384: no more pending results, returning what we have 28173 1726882788.44388: results queue empty 28173 1726882788.44389: checking for any_errors_fatal 28173 1726882788.44397: done checking for any_errors_fatal 28173 1726882788.44398: checking for max_fail_percentage 28173 1726882788.44400: done checking for max_fail_percentage 28173 1726882788.44401: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.44402: done checking to see if all hosts have failed 28173 1726882788.44402: getting the remaining hosts for this loop 28173 1726882788.44404: done getting the remaining hosts for this loop 28173 1726882788.44408: getting the next task for host managed_node2 28173 1726882788.44413: done getting next task for host managed_node2 28173 1726882788.44417: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882788.44420: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882788.44438: getting variables 28173 1726882788.44440: in VariableManager get_vars() 28173 1726882788.44481: Calling all_inventory to load vars for managed_node2 28173 1726882788.44484: Calling groups_inventory to load vars for managed_node2 28173 1726882788.44486: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.44499: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.44502: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.44506: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.45475: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000eb 28173 1726882788.45479: WORKER PROCESS EXITING 28173 1726882788.46926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.51061: done with get_vars() 28173 1726882788.51293: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:48 -0400 (0:00:00.111) 0:00:41.678 ****** 28173 1726882788.51401: entering _queue_task() for managed_node2/ping 28173 1726882788.51974: worker is 1 (out of 1 available) 28173 1726882788.51987: exiting _queue_task() for managed_node2/ping 28173 1726882788.52000: done queuing things up, now waiting for results queue to drain 28173 1726882788.52001: waiting for pending results... 28173 1726882788.53337: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28173 1726882788.53982: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ec 28173 1726882788.54004: variable 'ansible_search_path' from source: unknown 28173 1726882788.54010: variable 'ansible_search_path' from source: unknown 28173 1726882788.54051: calling self._execute() 28173 1726882788.54154: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.54167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.54182: variable 'omit' from source: magic vars 28173 1726882788.54553: variable 'ansible_distribution_major_version' from source: facts 28173 1726882788.55057: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882788.55070: variable 'omit' from source: magic vars 28173 1726882788.55119: variable 'omit' from source: magic vars 28173 1726882788.55158: variable 'omit' from source: magic vars 28173 1726882788.55203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882788.55240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882788.55271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882788.55293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.55310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882788.55342: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882788.55351: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.55361: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 28173 1726882788.55470: Set connection var ansible_pipelining to False 28173 1726882788.55484: Set connection var ansible_shell_type to sh 28173 1726882788.55498: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882788.55509: Set connection var ansible_timeout to 10 28173 1726882788.55517: Set connection var ansible_shell_executable to /bin/sh 28173 1726882788.55528: Set connection var ansible_connection to ssh 28173 1726882788.55554: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.55562: variable 'ansible_connection' from source: unknown 28173 1726882788.55572: variable 'ansible_module_compression' from source: unknown 28173 1726882788.55579: variable 'ansible_shell_type' from source: unknown 28173 1726882788.55591: variable 'ansible_shell_executable' from source: unknown 28173 1726882788.55598: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882788.55605: variable 'ansible_pipelining' from source: unknown 28173 1726882788.55611: variable 'ansible_timeout' from source: unknown 28173 1726882788.55618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882788.55829: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882788.55844: variable 'omit' from source: magic vars 28173 1726882788.55854: starting attempt loop 28173 1726882788.55860: running the handler 28173 1726882788.55880: _low_level_execute_command(): starting 28173 1726882788.55892: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882788.56666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.56685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.56701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.56723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.56770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.56787: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.56803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.56823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.56836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.56847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.56858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.56875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.56892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.56906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.56917: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.56930: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.57011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.57032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.57049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.58182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.58931: stdout chunk (state=3): >>>/root <<< 28173 1726882788.59127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.59130: stdout chunk (state=3): >>><<< 28173 1726882788.59133: stderr chunk (state=3): >>><<< 28173 1726882788.59248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882788.59254: _low_level_execute_command(): starting 28173 1726882788.59261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586 `" && echo ansible-tmp-1726882788.5915515-30004-257945904806586="` echo /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586 `" ) && sleep 0' 28173 1726882788.61278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.61293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.61310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.61330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.61381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.61394: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.61412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.61430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.61442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.61454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.61472: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 28173 1726882788.61488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.61503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.61513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.61524: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.61538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.61618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.61638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.61652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.61986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.63702: stdout chunk (state=3): >>>ansible-tmp-1726882788.5915515-30004-257945904806586=/root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586 <<< 28173 1726882788.63884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.63917: stderr chunk (state=3): >>><<< 28173 1726882788.63920: stdout chunk (state=3): >>><<< 28173 1726882788.64072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882788.5915515-30004-257945904806586=/root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882788.64076: variable 'ansible_module_compression' from source: unknown 28173 1726882788.64078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28173 1726882788.64081: variable 'ansible_facts' from source: unknown 28173 1726882788.64145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/AnsiballZ_ping.py 28173 1726882788.64316: Sending initial data 28173 1726882788.64319: Sent initial data (153 bytes) 28173 1726882788.65334: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.65347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.65361: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.65386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.65533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.65545: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.65558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.65579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.65591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.65606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.65619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.65632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.65648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.65659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.65673: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.65686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.65757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.65844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.65861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.65994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.67745: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882788.67845: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882788.67942: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpu6z_jzgv /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/AnsiballZ_ping.py <<< 28173 1726882788.68042: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882788.69548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.69674: stderr chunk (state=3): >>><<< 28173 1726882788.69677: stdout chunk (state=3): >>><<< 28173 1726882788.69679: done transferring module to remote 28173 1726882788.69685: _low_level_execute_command(): starting 28173 1726882788.69687: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/ /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/AnsiballZ_ping.py && sleep 0' 28173 1726882788.70919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.70931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.70943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.70958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.71013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.71026: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.71040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.71058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.71078: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.71092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.71109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.71122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.71138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.71149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.71161: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.71180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.71260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.71282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.71298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.71446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.73388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.73451: stderr chunk (state=3): >>><<< 28173 1726882788.73454: stdout chunk (state=3): >>><<< 28173 1726882788.73546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882788.73550: _low_level_execute_command(): starting 28173 1726882788.73552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/AnsiballZ_ping.py && sleep 0' 28173 1726882788.74470: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.74485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.74501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.74524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.74569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.74585: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.74601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.74621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882788.74638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882788.74650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882788.74661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.74684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.74700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.74714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.74725: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882788.74737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.74828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.74844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.74861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.75052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.88011: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28173 1726882788.89099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882788.89176: stdout chunk (state=3): >>><<< 28173 1726882788.89180: stderr chunk (state=3): >>><<< 28173 1726882788.89183: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882788.89191: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882788.89193: _low_level_execute_command(): starting 28173 1726882788.89195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882788.5915515-30004-257945904806586/ > /dev/null 2>&1 && sleep 0' 28173 1726882788.89835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882788.89848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882788.89862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882788.89889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.89930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882788.89943: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882788.89958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.89987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882788.89991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882788.89993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882788.90062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882788.90071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882788.90074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882788.90194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882788.92008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882788.92051: stderr chunk (state=3): >>><<< 28173 1726882788.92058: stdout chunk (state=3): >>><<< 28173 1726882788.92125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882788.92129: handler run complete 28173 1726882788.92132: attempt loop complete, returning result 28173 1726882788.92135: _execute() done 28173 1726882788.92137: dumping result to json 28173 1726882788.92140: done dumping result, returning 28173 1726882788.92142: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-926c-8928-0000000000ec] 28173 1726882788.92145: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ec 28173 1726882788.92207: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ec 28173 1726882788.92209: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28173 1726882788.92267: no more pending results, returning what we have 28173 1726882788.92270: results queue empty 28173 1726882788.92271: checking for any_errors_fatal 28173 1726882788.92279: done checking for any_errors_fatal 28173 1726882788.92279: checking for max_fail_percentage 28173 1726882788.92281: done checking for max_fail_percentage 28173 1726882788.92282: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.92282: done checking to see if all hosts have failed 28173 1726882788.92283: getting the remaining hosts for this loop 28173 1726882788.92285: done getting the remaining hosts for this loop 28173 1726882788.92288: 
getting the next task for host managed_node2 28173 1726882788.92295: done getting next task for host managed_node2 28173 1726882788.92297: ^ task is: TASK: meta (role_complete) 28173 1726882788.92300: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882788.92308: getting variables 28173 1726882788.92310: in VariableManager get_vars() 28173 1726882788.92350: Calling all_inventory to load vars for managed_node2 28173 1726882788.92353: Calling groups_inventory to load vars for managed_node2 28173 1726882788.92355: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.92366: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.92369: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.92372: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.94449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.96002: done with get_vars() 28173 1726882788.96023: done getting variables 28173 1726882788.96085: done queuing things up, now waiting for results queue to drain 28173 1726882788.96087: results queue empty 28173 1726882788.96087: checking for any_errors_fatal 28173 1726882788.96089: done checking for any_errors_fatal 28173 1726882788.96090: checking for max_fail_percentage 28173 1726882788.96091: done checking for max_fail_percentage 28173 1726882788.96091: checking to see if all hosts have failed and the running result is not ok 28173 1726882788.96092: done checking to see if all hosts have failed 28173 1726882788.96092: getting the remaining hosts for this loop 28173 1726882788.96093: done getting the remaining hosts for this loop 28173 1726882788.96095: getting the next task for host managed_node2 28173 1726882788.96097: done getting next task for host managed_node2 28173 1726882788.96098: ^ task is: TASK: meta (flush_handlers) 28173 1726882788.96099: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882788.96104: getting variables 28173 1726882788.96105: in VariableManager get_vars() 28173 1726882788.96114: Calling all_inventory to load vars for managed_node2 28173 1726882788.96115: Calling groups_inventory to load vars for managed_node2 28173 1726882788.96116: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.96120: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.96121: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.96123: Calling groups_plugins_play to load vars for managed_node2 28173 1726882788.97194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882788.99246: done with get_vars() 28173 1726882788.99276: done getting variables 28173 1726882788.99366: in VariableManager get_vars() 28173 1726882788.99380: Calling all_inventory to load vars for managed_node2 28173 1726882788.99382: Calling groups_inventory to load vars for managed_node2 28173 1726882788.99385: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882788.99390: Calling all_plugins_play to load vars for managed_node2 28173 1726882788.99394: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882788.99400: Calling groups_plugins_play to load vars for managed_node2 28173 1726882789.00728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.02515: done with get_vars() 28173 1726882789.02549: done queuing things up, now waiting for results queue to drain 28173 1726882789.02551: results queue empty 28173 1726882789.02552: checking for any_errors_fatal 28173 1726882789.02553: done checking for any_errors_fatal 28173 1726882789.02554: checking for max_fail_percentage 28173 1726882789.02555: done checking for max_fail_percentage 28173 1726882789.02556: checking to see if all hosts have failed and the running result is not ok 28173 1726882789.02557: done checking to see if all hosts have failed 28173 1726882789.02557: getting the remaining hosts for this loop 28173 1726882789.02558: done getting the remaining hosts for this loop 28173 1726882789.02561: getting the next task for host managed_node2 28173 1726882789.02567: done getting next task for host managed_node2 28173 1726882789.02568: ^ task is: TASK: meta (flush_handlers) 28173 1726882789.02570: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882789.02573: getting variables 28173 1726882789.02574: in VariableManager get_vars() 28173 1726882789.02591: Calling all_inventory to load vars for managed_node2 28173 1726882789.02594: Calling groups_inventory to load vars for managed_node2 28173 1726882789.02596: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882789.02602: Calling all_plugins_play to load vars for managed_node2 28173 1726882789.02604: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882789.02607: Calling groups_plugins_play to load vars for managed_node2 28173 1726882789.03996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.05769: done with get_vars() 28173 1726882789.05798: done getting variables 28173 1726882789.05857: in VariableManager get_vars() 28173 1726882789.05874: Calling all_inventory to load vars for managed_node2 28173 1726882789.05878: Calling groups_inventory to load vars for managed_node2 28173 1726882789.05880: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882789.05886: Calling all_plugins_play to load vars for managed_node2 28173 1726882789.05888: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882789.05891: Calling groups_plugins_play to load vars for managed_node2 28173 1726882789.07313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.09120: done with get_vars() 28173 1726882789.09148: done queuing things up, now waiting for results queue to drain 28173 1726882789.09150: results queue empty 28173 1726882789.09151: checking for any_errors_fatal 28173 1726882789.09152: done checking for any_errors_fatal 28173 1726882789.09153: checking for max_fail_percentage 28173 1726882789.09154: done checking for max_fail_percentage 28173 1726882789.09155: checking to see if all hosts have failed and the running result is not ok 28173 1726882789.09156: done checking to see if all hosts have failed 28173 1726882789.09156: getting the remaining hosts for this loop 28173 1726882789.09157: done getting the remaining hosts for this loop 28173 1726882789.09160: getting the next task for host managed_node2 28173 1726882789.09166: done getting next task for host managed_node2 28173 1726882789.09167: ^ task is: None 28173 1726882789.09169: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882789.09170: done queuing things up, now waiting for results queue to drain 28173 1726882789.09171: results queue empty 28173 1726882789.09172: checking for any_errors_fatal 28173 1726882789.09173: done checking for any_errors_fatal 28173 1726882789.09173: checking for max_fail_percentage 28173 1726882789.09174: done checking for max_fail_percentage 28173 1726882789.09175: checking to see if all hosts have failed and the running result is not ok 28173 1726882789.09176: done checking to see if all hosts have failed 28173 1726882789.09177: getting the next task for host managed_node2 28173 1726882789.09180: done getting next task for host managed_node2 28173 1726882789.09181: ^ task is: None 28173 1726882789.09182: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882789.09232: in VariableManager get_vars() 28173 1726882789.09247: done with get_vars() 28173 1726882789.09253: in VariableManager get_vars() 28173 1726882789.09262: done with get_vars() 28173 1726882789.09269: variable 'omit' from source: magic vars 28173 1726882789.09300: in VariableManager get_vars() 28173 1726882789.09311: done with get_vars() 28173 1726882789.09336: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 28173 1726882789.09598: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28173 1726882789.09622: getting the remaining hosts for this loop 28173 1726882789.09623: done getting the remaining hosts for this loop 28173 1726882789.09626: getting the next task for host managed_node2 28173 1726882789.09628: done getting next task for host managed_node2 28173 1726882789.09631: ^ task is: TASK: Gathering Facts 28173 1726882789.09632: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882789.09634: getting variables 28173 1726882789.09635: in VariableManager get_vars() 28173 1726882789.09643: Calling all_inventory to load vars for managed_node2 28173 1726882789.09645: Calling groups_inventory to load vars for managed_node2 28173 1726882789.09649: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882789.09660: Calling all_plugins_play to load vars for managed_node2 28173 1726882789.09662: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882789.09669: Calling groups_plugins_play to load vars for managed_node2 28173 1726882789.10995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.12027: done with get_vars() 28173 1726882789.12043: done getting variables 28173 1726882789.12075: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149 Friday 20 September 2024 21:39:49 -0400 (0:00:00.606) 0:00:42.285 ****** 28173 1726882789.12093: entering _queue_task() for managed_node2/gather_facts 28173 1726882789.12321: worker is 1 (out of 1 available) 28173 1726882789.12334: exiting _queue_task() for managed_node2/gather_facts 28173 1726882789.12345: done queuing things up, now waiting for results queue to drain 28173 1726882789.12347: waiting for pending results... 
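The Gathering Facts task queued here belongs to the next play, PLAY [Assert device and profile are absent], whose task path points at tests_route_table.yml:149. A minimal sketch of roughly how such a play could be declared is below; the host pattern and the placeholder assertion are assumptions, since the playbook's actual task list is not visible in this log:

- name: Assert device and profile are absent
  hosts: managed_node2            # assumed host pattern; the log only shows this host running the play
  gather_facts: true              # fact gathering is what produces the "Gathering Facts" task traced below
  tasks:
    - name: Placeholder for the assertions performed by the real playbook
      ansible.builtin.assert:
        that:
          - true                  # the actual conditions are not shown in this log

With gather_facts enabled, the linear strategy queues the setup module first, which is the AnsiballZ_setup.py transfer and execution traced in the following entries.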
28173 1726882789.12530: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28173 1726882789.12603: in run() - task 0e448fcc-3ce9-926c-8928-00000000085b 28173 1726882789.12615: variable 'ansible_search_path' from source: unknown 28173 1726882789.12645: calling self._execute() 28173 1726882789.12724: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882789.12728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882789.12737: variable 'omit' from source: magic vars 28173 1726882789.13138: variable 'ansible_distribution_major_version' from source: facts 28173 1726882789.13159: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882789.13175: variable 'omit' from source: magic vars 28173 1726882789.13206: variable 'omit' from source: magic vars 28173 1726882789.13253: variable 'omit' from source: magic vars 28173 1726882789.13301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882789.13345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882789.13380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882789.13401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882789.13417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882789.13452: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882789.13471: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882789.13481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882789.13589: Set connection var ansible_pipelining to False 28173 1726882789.13596: Set connection var ansible_shell_type to sh 28173 1726882789.13607: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882789.13618: Set connection var ansible_timeout to 10 28173 1726882789.13627: Set connection var ansible_shell_executable to /bin/sh 28173 1726882789.13635: Set connection var ansible_connection to ssh 28173 1726882789.13658: variable 'ansible_shell_executable' from source: unknown 28173 1726882789.13670: variable 'ansible_connection' from source: unknown 28173 1726882789.13684: variable 'ansible_module_compression' from source: unknown 28173 1726882789.13692: variable 'ansible_shell_type' from source: unknown 28173 1726882789.13699: variable 'ansible_shell_executable' from source: unknown 28173 1726882789.13706: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882789.13713: variable 'ansible_pipelining' from source: unknown 28173 1726882789.13720: variable 'ansible_timeout' from source: unknown 28173 1726882789.13728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882789.13934: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882789.13951: variable 'omit' from source: magic vars 28173 1726882789.13961: starting attempt loop 28173 1726882789.13973: running the 
handler 28173 1726882789.13998: variable 'ansible_facts' from source: unknown 28173 1726882789.14036: _low_level_execute_command(): starting 28173 1726882789.14048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882789.14933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882789.14957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.14995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.15026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.15123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.15134: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882789.15146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.15169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882789.15182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882789.15200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882789.15246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.15249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.15252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.15316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.15324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882789.15327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.15454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.17157: stdout chunk (state=3): >>>/root <<< 28173 1726882789.17271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882789.17348: stderr chunk (state=3): >>><<< 28173 1726882789.17809: stdout chunk (state=3): >>><<< 28173 1726882789.17928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882789.17931: _low_level_execute_command(): starting 28173 1726882789.17934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892 `" && echo ansible-tmp-1726882789.178372-30043-218147391457892="` echo /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892 `" ) && sleep 0' 28173 1726882789.18489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882789.18511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.18525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.18543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.18587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.18598: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882789.18610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.18626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882789.18636: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882789.18646: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882789.18657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.18675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.18692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.18703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.18713: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882789.18725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.18801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.18823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882789.18839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.18974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.20872: stdout chunk (state=3): >>>ansible-tmp-1726882789.178372-30043-218147391457892=/root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892 <<< 28173 1726882789.21062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882789.21069: stdout chunk (state=3): >>><<< 28173 1726882789.21071: stderr chunk (state=3): >>><<< 28173 1726882789.21480: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882789.178372-30043-218147391457892=/root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882789.21483: variable 'ansible_module_compression' from source: unknown 28173 1726882789.21486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28173 1726882789.21488: variable 'ansible_facts' from source: unknown 28173 1726882789.21489: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/AnsiballZ_setup.py 28173 1726882789.21699: Sending initial data 28173 1726882789.21707: Sent initial data (153 bytes) 28173 1726882789.23490: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882789.24581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.24597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.24616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.24658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.24675: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882789.24691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.24709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882789.24721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882789.24732: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882789.24744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.24759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.24777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.24789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.24801: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882789.24814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.24888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.24912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 
1726882789.24929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.25066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.26903: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 28173 1726882789.26907: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882789.26996: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882789.27096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpv9d6b8te /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/AnsiballZ_setup.py <<< 28173 1726882789.27193: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882789.30340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882789.30610: stderr chunk (state=3): >>><<< 28173 1726882789.30613: stdout chunk (state=3): >>><<< 28173 1726882789.30616: done transferring module to remote 28173 1726882789.30618: _low_level_execute_command(): starting 28173 1726882789.30624: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/ /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/AnsiballZ_setup.py && sleep 0' 28173 1726882789.32334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882789.32420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.32437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.32457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.32534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.32584: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882789.32605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.32623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882789.32662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882789.32682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882789.32696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.32717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.32734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.32746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 28173 1726882789.32757: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882789.32776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.32969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.32987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882789.33001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.33134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.34985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882789.35050: stderr chunk (state=3): >>><<< 28173 1726882789.35054: stdout chunk (state=3): >>><<< 28173 1726882789.35154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882789.35158: _low_level_execute_command(): starting 28173 1726882789.35160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/AnsiballZ_setup.py && sleep 0' 28173 1726882789.36690: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882789.36781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882789.36799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.36819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.36860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.36920: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882789.36934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.36951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882789.36965: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882789.36979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882789.36990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 
1726882789.37001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.37021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.37036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882789.37045: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882789.37057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.37260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.37281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882789.37295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.37480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.90289: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_pytho<<< 28173 1726882789.90306: stdout chunk (state=3): >>>n_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": 
"root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2812, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 720, "free": 2812}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version<<< 28173 1726882789.90341: stdout chunk (state=3): >>>": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 728, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234229760, "block_size": 4096, "block_total": 65519355, "block_available": 64510310, "block_used": 1009045, "inode_total": 131071472, "inode_available": 130998692, "inode_used": 72780, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "49", "epoch": "1726882789", "epoch_int": "1726882789", "date": "2024-09-20", "time": "21:39:49", "iso8601_micro": "2024-09-21T01:39:49.849826Z", "iso8601": "2024-09-21T01:39:49Z", "iso8601_basic": "20240920T213949849826", "iso8601_basic_short": "20240920T213949", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.47, "5m": 0.42, "15m": 0.26}, "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmen<<< 28173 1726882789.90365: stdout chunk (state=3): >>>tation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum<<< 28173 1726882789.90371: stdout chunk (state=3): >>>_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", 
"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28173 1726882789.91939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882789.91997: stderr chunk (state=3): >>><<< 28173 1726882789.92000: stdout chunk (state=3): >>><<< 28173 1726882789.92038: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2812, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 720, "free": 2812}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": 
"NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 728, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234229760, "block_size": 4096, "block_total": 65519355, "block_available": 64510310, "block_used": 1009045, "inode_total": 131071472, "inode_available": 130998692, "inode_used": 72780, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "49", "epoch": "1726882789", "epoch_int": "1726882789", "date": "2024-09-20", "time": "21:39:49", "iso8601_micro": "2024-09-21T01:39:49.849826Z", "iso8601": "2024-09-21T01:39:49Z", "iso8601_basic": "20240920T213949849826", "iso8601_basic_short": "20240920T213949", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.47, "5m": 0.42, "15m": 0.26}, "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, 
"type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882789.92391: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882789.92409: _low_level_execute_command(): starting 28173 1726882789.92413: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882789.178372-30043-218147391457892/ > /dev/null 2>&1 && sleep 0' 28173 1726882789.92853: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882789.92871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882789.92886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882789.92899: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882789.92942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882789.92954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882789.93066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882789.94920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882789.94999: stderr chunk (state=3): >>><<< 28173 1726882789.95008: stdout chunk (state=3): >>><<< 28173 1726882789.95034: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882789.95050: handler run complete 28173 1726882789.95316: variable 'ansible_facts' from source: unknown 28173 1726882789.95440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.95669: variable 'ansible_facts' from source: unknown 28173 1726882789.95729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.95813: attempt loop complete, returning result 28173 1726882789.95816: _execute() done 28173 1726882789.95818: dumping result to json 28173 1726882789.95839: done dumping result, returning 28173 1726882789.95846: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-926c-8928-00000000085b] 28173 1726882789.95852: sending task result for task 0e448fcc-3ce9-926c-8928-00000000085b 28173 1726882789.96259: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000085b 28173 1726882789.96262: WORKER PROCESS EXITING ok: [managed_node2] 28173 1726882789.96446: no more pending results, returning what we have 28173 1726882789.96448: results queue empty 28173 1726882789.96448: checking for any_errors_fatal 28173 1726882789.96449: done checking for any_errors_fatal 28173 1726882789.96449: checking for max_fail_percentage 28173 1726882789.96450: done checking for max_fail_percentage 28173 1726882789.96451: checking to see if all 
hosts have failed and the running result is not ok 28173 1726882789.96452: done checking to see if all hosts have failed 28173 1726882789.96452: getting the remaining hosts for this loop 28173 1726882789.96453: done getting the remaining hosts for this loop 28173 1726882789.96455: getting the next task for host managed_node2 28173 1726882789.96458: done getting next task for host managed_node2 28173 1726882789.96459: ^ task is: TASK: meta (flush_handlers) 28173 1726882789.96461: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882789.96465: getting variables 28173 1726882789.96468: in VariableManager get_vars() 28173 1726882789.96485: Calling all_inventory to load vars for managed_node2 28173 1726882789.96488: Calling groups_inventory to load vars for managed_node2 28173 1726882789.96491: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882789.96499: Calling all_plugins_play to load vars for managed_node2 28173 1726882789.96500: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882789.96502: Calling groups_plugins_play to load vars for managed_node2 28173 1726882789.97298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882789.99206: done with get_vars() 28173 1726882789.99229: done getting variables 28173 1726882789.99301: in VariableManager get_vars() 28173 1726882789.99310: Calling all_inventory to load vars for managed_node2 28173 1726882789.99312: Calling groups_inventory to load vars for managed_node2 28173 1726882789.99314: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882789.99319: Calling all_plugins_play to load vars for managed_node2 28173 1726882789.99321: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882789.99324: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.00691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.02533: done with get_vars() 28173 1726882790.02575: done queuing things up, now waiting for results queue to drain 28173 1726882790.02577: results queue empty 28173 1726882790.02578: checking for any_errors_fatal 28173 1726882790.02582: done checking for any_errors_fatal 28173 1726882790.02583: checking for max_fail_percentage 28173 1726882790.02584: done checking for max_fail_percentage 28173 1726882790.02585: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.02590: done checking to see if all hosts have failed 28173 1726882790.02591: getting the remaining hosts for this loop 28173 1726882790.02592: done getting the remaining hosts for this loop 28173 1726882790.02595: getting the next task for host managed_node2 28173 1726882790.02607: done getting next task for host managed_node2 28173 1726882790.02609: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 28173 1726882790.02611: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882790.02614: getting variables 28173 1726882790.02615: in VariableManager get_vars() 28173 1726882790.02625: Calling all_inventory to load vars for managed_node2 28173 1726882790.02627: Calling groups_inventory to load vars for managed_node2 28173 1726882790.02629: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.02641: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.02645: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.02648: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.04206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.06015: done with get_vars() 28173 1726882790.06036: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:152 Friday 20 September 2024 21:39:50 -0400 (0:00:00.940) 0:00:43.225 ****** 28173 1726882790.06114: entering _queue_task() for managed_node2/include_tasks 28173 1726882790.06445: worker is 1 (out of 1 available) 28173 1726882790.06459: exiting _queue_task() for managed_node2/include_tasks 28173 1726882790.06475: done queuing things up, now waiting for results queue to drain 28173 1726882790.06476: waiting for pending results... 28173 1726882790.06766: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' 28173 1726882790.06887: in run() - task 0e448fcc-3ce9-926c-8928-0000000000ef 28173 1726882790.06910: variable 'ansible_search_path' from source: unknown 28173 1726882790.06957: calling self._execute() 28173 1726882790.07055: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.07069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.07085: variable 'omit' from source: magic vars 28173 1726882790.07526: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.07545: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.07556: _execute() done 28173 1726882790.07565: dumping result to json 28173 1726882790.07578: done dumping result, returning 28173 1726882790.07588: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' [0e448fcc-3ce9-926c-8928-0000000000ef] 28173 1726882790.07619: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ef 28173 1726882790.07770: no more pending results, returning what we have 28173 1726882790.07777: in VariableManager get_vars() 28173 1726882790.07815: Calling all_inventory to load vars for managed_node2 28173 1726882790.07818: Calling groups_inventory to load vars for managed_node2 28173 1726882790.07822: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.07837: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.07842: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.07845: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.09165: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000ef 28173 1726882790.09174: WORKER PROCESS EXITING 28173 1726882790.09987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.11761: done with get_vars() 28173 
1726882790.11785: variable 'ansible_search_path' from source: unknown 28173 1726882790.11799: we have included files to process 28173 1726882790.11800: generating all_blocks data 28173 1726882790.11802: done generating all_blocks data 28173 1726882790.11803: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28173 1726882790.11804: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28173 1726882790.11806: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28173 1726882790.11972: in VariableManager get_vars() 28173 1726882790.11989: done with get_vars() 28173 1726882790.12099: done processing included file 28173 1726882790.12101: iterating over new_blocks loaded from include file 28173 1726882790.12103: in VariableManager get_vars() 28173 1726882790.12114: done with get_vars() 28173 1726882790.12115: filtering new block on tags 28173 1726882790.12131: done filtering new block on tags 28173 1726882790.12133: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 28173 1726882790.12138: extending task lists for all hosts with included blocks 28173 1726882790.12206: done extending task lists 28173 1726882790.12207: done processing included files 28173 1726882790.12208: results queue empty 28173 1726882790.12209: checking for any_errors_fatal 28173 1726882790.12210: done checking for any_errors_fatal 28173 1726882790.12211: checking for max_fail_percentage 28173 1726882790.12212: done checking for max_fail_percentage 28173 1726882790.12213: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.12214: done checking to see if all hosts have failed 28173 1726882790.12215: getting the remaining hosts for this loop 28173 1726882790.12216: done getting the remaining hosts for this loop 28173 1726882790.12218: getting the next task for host managed_node2 28173 1726882790.12222: done getting next task for host managed_node2 28173 1726882790.12224: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28173 1726882790.12226: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882790.12229: getting variables 28173 1726882790.12230: in VariableManager get_vars() 28173 1726882790.12237: Calling all_inventory to load vars for managed_node2 28173 1726882790.12240: Calling groups_inventory to load vars for managed_node2 28173 1726882790.12242: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.12247: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.12249: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.12252: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.18101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.19762: done with get_vars() 28173 1726882790.19782: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:39:50 -0400 (0:00:00.137) 0:00:43.362 ****** 28173 1726882790.19829: entering _queue_task() for managed_node2/include_tasks 28173 1726882790.20060: worker is 1 (out of 1 available) 28173 1726882790.20075: exiting _queue_task() for managed_node2/include_tasks 28173 1726882790.20086: done queuing things up, now waiting for results queue to drain 28173 1726882790.20087: waiting for pending results... 28173 1726882790.20271: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28173 1726882790.20359: in run() - task 0e448fcc-3ce9-926c-8928-00000000086c 28173 1726882790.20376: variable 'ansible_search_path' from source: unknown 28173 1726882790.20379: variable 'ansible_search_path' from source: unknown 28173 1726882790.20407: calling self._execute() 28173 1726882790.20488: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.20492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.20500: variable 'omit' from source: magic vars 28173 1726882790.20787: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.20798: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.20803: _execute() done 28173 1726882790.20807: dumping result to json 28173 1726882790.20809: done dumping result, returning 28173 1726882790.20815: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-926c-8928-00000000086c] 28173 1726882790.20820: sending task result for task 0e448fcc-3ce9-926c-8928-00000000086c 28173 1726882790.20911: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000086c 28173 1726882790.20913: WORKER PROCESS EXITING 28173 1726882790.20938: no more pending results, returning what we have 28173 1726882790.20942: in VariableManager get_vars() 28173 1726882790.20982: Calling all_inventory to load vars for managed_node2 28173 1726882790.20984: Calling groups_inventory to load vars for managed_node2 28173 1726882790.20988: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.21000: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.21002: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.21005: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.22433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28173 1726882790.23428: done with get_vars() 28173 1726882790.23441: variable 'ansible_search_path' from source: unknown 28173 1726882790.23442: variable 'ansible_search_path' from source: unknown 28173 1726882790.23470: we have included files to process 28173 1726882790.23471: generating all_blocks data 28173 1726882790.23472: done generating all_blocks data 28173 1726882790.23473: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28173 1726882790.23474: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28173 1726882790.23476: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28173 1726882790.24160: done processing included file 28173 1726882790.24162: iterating over new_blocks loaded from include file 28173 1726882790.24163: in VariableManager get_vars() 28173 1726882790.24175: done with get_vars() 28173 1726882790.24176: filtering new block on tags 28173 1726882790.24190: done filtering new block on tags 28173 1726882790.24192: in VariableManager get_vars() 28173 1726882790.24198: done with get_vars() 28173 1726882790.24199: filtering new block on tags 28173 1726882790.24213: done filtering new block on tags 28173 1726882790.24215: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28173 1726882790.24218: extending task lists for all hosts with included blocks 28173 1726882790.24396: done extending task lists 28173 1726882790.24397: done processing included files 28173 1726882790.24397: results queue empty 28173 1726882790.24397: checking for any_errors_fatal 28173 1726882790.24399: done checking for any_errors_fatal 28173 1726882790.24400: checking for max_fail_percentage 28173 1726882790.24400: done checking for max_fail_percentage 28173 1726882790.24401: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.24401: done checking to see if all hosts have failed 28173 1726882790.24402: getting the remaining hosts for this loop 28173 1726882790.24403: done getting the remaining hosts for this loop 28173 1726882790.24404: getting the next task for host managed_node2 28173 1726882790.24407: done getting next task for host managed_node2 28173 1726882790.24408: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28173 1726882790.24410: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882790.24412: getting variables 28173 1726882790.24413: in VariableManager get_vars() 28173 1726882790.24418: Calling all_inventory to load vars for managed_node2 28173 1726882790.24419: Calling groups_inventory to load vars for managed_node2 28173 1726882790.24421: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.24424: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.24430: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.24433: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.25645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.26949: done with get_vars() 28173 1726882790.26963: done getting variables 28173 1726882790.26992: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:39:50 -0400 (0:00:00.071) 0:00:43.434 ****** 28173 1726882790.27013: entering _queue_task() for managed_node2/set_fact 28173 1726882790.27229: worker is 1 (out of 1 available) 28173 1726882790.27240: exiting _queue_task() for managed_node2/set_fact 28173 1726882790.27251: done queuing things up, now waiting for results queue to drain 28173 1726882790.27252: waiting for pending results... 
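
For reference, the task being queued here (get_profile_stat.yml:3) is a plain set_fact. Judging from the result it reports a little further down in this trace, it initializes three flags to false; a minimal sketch, assuming these are the only facts the task sets:

    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
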
28173 1726882790.27434: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28173 1726882790.27508: in run() - task 0e448fcc-3ce9-926c-8928-00000000087b 28173 1726882790.27522: variable 'ansible_search_path' from source: unknown 28173 1726882790.27525: variable 'ansible_search_path' from source: unknown 28173 1726882790.27552: calling self._execute() 28173 1726882790.27635: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.27639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.27648: variable 'omit' from source: magic vars 28173 1726882790.27924: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.27935: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.27941: variable 'omit' from source: magic vars 28173 1726882790.27973: variable 'omit' from source: magic vars 28173 1726882790.27999: variable 'omit' from source: magic vars 28173 1726882790.28029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882790.28054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882790.28073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882790.28087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.28097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.28125: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882790.28128: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.28133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.28228: Set connection var ansible_pipelining to False 28173 1726882790.28232: Set connection var ansible_shell_type to sh 28173 1726882790.28234: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882790.28243: Set connection var ansible_timeout to 10 28173 1726882790.28250: Set connection var ansible_shell_executable to /bin/sh 28173 1726882790.28253: Set connection var ansible_connection to ssh 28173 1726882790.28277: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.28282: variable 'ansible_connection' from source: unknown 28173 1726882790.28299: variable 'ansible_module_compression' from source: unknown 28173 1726882790.28309: variable 'ansible_shell_type' from source: unknown 28173 1726882790.28312: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.28314: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.28316: variable 'ansible_pipelining' from source: unknown 28173 1726882790.28318: variable 'ansible_timeout' from source: unknown 28173 1726882790.28321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.28436: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882790.28447: variable 
'omit' from source: magic vars 28173 1726882790.28452: starting attempt loop 28173 1726882790.28454: running the handler 28173 1726882790.28470: handler run complete 28173 1726882790.28486: attempt loop complete, returning result 28173 1726882790.28494: _execute() done 28173 1726882790.28502: dumping result to json 28173 1726882790.28509: done dumping result, returning 28173 1726882790.28518: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-926c-8928-00000000087b] 28173 1726882790.28523: sending task result for task 0e448fcc-3ce9-926c-8928-00000000087b 28173 1726882790.28643: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000087b 28173 1726882790.28646: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28173 1726882790.28704: no more pending results, returning what we have 28173 1726882790.28708: results queue empty 28173 1726882790.28709: checking for any_errors_fatal 28173 1726882790.28711: done checking for any_errors_fatal 28173 1726882790.28712: checking for max_fail_percentage 28173 1726882790.28713: done checking for max_fail_percentage 28173 1726882790.28714: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.28715: done checking to see if all hosts have failed 28173 1726882790.28716: getting the remaining hosts for this loop 28173 1726882790.28717: done getting the remaining hosts for this loop 28173 1726882790.28721: getting the next task for host managed_node2 28173 1726882790.28727: done getting next task for host managed_node2 28173 1726882790.28729: ^ task is: TASK: Stat profile file 28173 1726882790.28736: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882790.28742: getting variables 28173 1726882790.28743: in VariableManager get_vars() 28173 1726882790.28773: Calling all_inventory to load vars for managed_node2 28173 1726882790.28776: Calling groups_inventory to load vars for managed_node2 28173 1726882790.28779: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.28794: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.28798: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.28801: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.30281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.31844: done with get_vars() 28173 1726882790.31858: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:39:50 -0400 (0:00:00.049) 0:00:43.483 ****** 28173 1726882790.31922: entering _queue_task() for managed_node2/stat 28173 1726882790.32107: worker is 1 (out of 1 available) 28173 1726882790.32118: exiting _queue_task() for managed_node2/stat 28173 1726882790.32129: done queuing things up, now waiting for results queue to drain 28173 1726882790.32130: waiting for pending results... 28173 1726882790.32303: running TaskExecutor() for managed_node2/TASK: Stat profile file 28173 1726882790.32378: in run() - task 0e448fcc-3ce9-926c-8928-00000000087c 28173 1726882790.32389: variable 'ansible_search_path' from source: unknown 28173 1726882790.32393: variable 'ansible_search_path' from source: unknown 28173 1726882790.32419: calling self._execute() 28173 1726882790.32501: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.32505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.32514: variable 'omit' from source: magic vars 28173 1726882790.32782: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.32793: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.32799: variable 'omit' from source: magic vars 28173 1726882790.32831: variable 'omit' from source: magic vars 28173 1726882790.32902: variable 'profile' from source: include params 28173 1726882790.32906: variable 'interface' from source: set_fact 28173 1726882790.32958: variable 'interface' from source: set_fact 28173 1726882790.32975: variable 'omit' from source: magic vars 28173 1726882790.33006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882790.33034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882790.33049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882790.33062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.33076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.33105: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882790.33108: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.33110: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.33178: Set connection var ansible_pipelining to False 28173 1726882790.33181: Set connection var ansible_shell_type to sh 28173 1726882790.33187: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882790.33195: Set connection var ansible_timeout to 10 28173 1726882790.33197: Set connection var ansible_shell_executable to /bin/sh 28173 1726882790.33203: Set connection var ansible_connection to ssh 28173 1726882790.33220: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.33223: variable 'ansible_connection' from source: unknown 28173 1726882790.33225: variable 'ansible_module_compression' from source: unknown 28173 1726882790.33232: variable 'ansible_shell_type' from source: unknown 28173 1726882790.33234: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.33237: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.33241: variable 'ansible_pipelining' from source: unknown 28173 1726882790.33243: variable 'ansible_timeout' from source: unknown 28173 1726882790.33247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.33484: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882790.33502: variable 'omit' from source: magic vars 28173 1726882790.33512: starting attempt loop 28173 1726882790.33519: running the handler 28173 1726882790.33535: _low_level_execute_command(): starting 28173 1726882790.33547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882790.34319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882790.34345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.34376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.34393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.34427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.34430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.34432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.34514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.34521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.34630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.36295: stdout chunk (state=3): >>>/root <<< 28173 
1726882790.36397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.36439: stderr chunk (state=3): >>><<< 28173 1726882790.36442: stdout chunk (state=3): >>><<< 28173 1726882790.36460: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.36473: _low_level_execute_command(): starting 28173 1726882790.36480: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671 `" && echo ansible-tmp-1726882790.3645916-30098-58573263251671="` echo /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671 `" ) && sleep 0' 28173 1726882790.36901: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.36904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.36932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 28173 1726882790.36936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.36992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.36996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.37101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.38969: stdout chunk (state=3): >>>ansible-tmp-1726882790.3645916-30098-58573263251671=/root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671 <<< 28173 1726882790.39081: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.39123: stderr chunk (state=3): >>><<< 28173 1726882790.39126: stdout chunk (state=3): >>><<< 28173 1726882790.39138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882790.3645916-30098-58573263251671=/root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.39175: variable 'ansible_module_compression' from source: unknown 28173 1726882790.39224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28173 1726882790.39256: variable 'ansible_facts' from source: unknown 28173 1726882790.39308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/AnsiballZ_stat.py 28173 1726882790.39406: Sending initial data 28173 1726882790.39416: Sent initial data (152 bytes) 28173 1726882790.40036: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.40040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.40071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.40075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.40082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.40132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.40135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.40237: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28173 1726882790.41961: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882790.42058: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882790.42157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpg3847lkm /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/AnsiballZ_stat.py <<< 28173 1726882790.42251: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882790.43267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.43352: stderr chunk (state=3): >>><<< 28173 1726882790.43356: stdout chunk (state=3): >>><<< 28173 1726882790.43374: done transferring module to remote 28173 1726882790.43382: _low_level_execute_command(): starting 28173 1726882790.43387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/ /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/AnsiballZ_stat.py && sleep 0' 28173 1726882790.43794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.43798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.43831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.43833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.43836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.43883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.43895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.43998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.45739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.45782: stderr chunk (state=3): >>><<< 28173 1726882790.45789: stdout chunk (state=3): >>><<< 28173 1726882790.45803: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.45807: _low_level_execute_command(): starting 28173 1726882790.45810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/AnsiballZ_stat.py && sleep 0' 28173 1726882790.46218: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.46234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.46251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.46262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.46311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.46319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.46439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.59457: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28173 1726882790.60428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882790.60512: stderr chunk (state=3): >>><<< 28173 1726882790.60515: stdout chunk (state=3): >>><<< 28173 1726882790.60620: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882790.60625: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882790.60627: _low_level_execute_command(): starting 28173 1726882790.60629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882790.3645916-30098-58573263251671/ > /dev/null 2>&1 && sleep 0' 28173 1726882790.61082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.61086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.61118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.61123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.61125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.61183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.61186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.61290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.63117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.63163: stderr chunk (state=3): >>><<< 28173 1726882790.63170: stdout chunk (state=3): >>><<< 28173 1726882790.63182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.63187: handler run complete 28173 1726882790.63207: attempt loop complete, returning result 28173 1726882790.63210: _execute() done 28173 1726882790.63213: dumping result to json 28173 1726882790.63215: done dumping result, returning 28173 1726882790.63223: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-926c-8928-00000000087c] 28173 1726882790.63228: sending task result for task 0e448fcc-3ce9-926c-8928-00000000087c 28173 1726882790.63326: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000087c 28173 1726882790.63329: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28173 1726882790.63386: no more pending results, returning what we have 28173 1726882790.63389: results queue empty 28173 1726882790.63390: checking for any_errors_fatal 28173 1726882790.63400: done checking for any_errors_fatal 28173 1726882790.63400: checking for max_fail_percentage 28173 1726882790.63402: done checking for max_fail_percentage 28173 1726882790.63403: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.63404: done checking to see if all hosts have failed 28173 1726882790.63404: getting the remaining hosts for this loop 28173 1726882790.63406: done getting the remaining hosts for this loop 28173 1726882790.63409: getting the next task for host managed_node2 28173 
1726882790.63415: done getting next task for host managed_node2 28173 1726882790.63418: ^ task is: TASK: Set NM profile exist flag based on the profile files 28173 1726882790.63422: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882790.63425: getting variables 28173 1726882790.63427: in VariableManager get_vars() 28173 1726882790.63457: Calling all_inventory to load vars for managed_node2 28173 1726882790.63459: Calling groups_inventory to load vars for managed_node2 28173 1726882790.63464: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.63478: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.63481: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.63484: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.64370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.65755: done with get_vars() 28173 1726882790.65783: done getting variables 28173 1726882790.65842: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:39:50 -0400 (0:00:00.339) 0:00:43.823 ****** 28173 1726882790.65881: entering _queue_task() for managed_node2/set_fact 28173 1726882790.66192: worker is 1 (out of 1 available) 28173 1726882790.66206: exiting _queue_task() for managed_node2/set_fact 28173 1726882790.66218: done queuing things up, now waiting for results queue to drain 28173 1726882790.66219: waiting for pending results... 
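
The task queued here (get_profile_stat.yml:17) is another set_fact, this time gated on the stat result gathered above; the skip recorded just below shows its condition is profile_stat.stat.exists. A minimal sketch of such a task, where only the task name and the when condition are confirmed by this log and the fact name and value are assumptions:

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true   # assumed fact and value; only the condition is confirmed by this trace
      when: profile_stat.stat.exists
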
28173 1726882790.66511: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28173 1726882790.66644: in run() - task 0e448fcc-3ce9-926c-8928-00000000087d 28173 1726882790.66675: variable 'ansible_search_path' from source: unknown 28173 1726882790.66686: variable 'ansible_search_path' from source: unknown 28173 1726882790.66727: calling self._execute() 28173 1726882790.66836: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.66850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.66872: variable 'omit' from source: magic vars 28173 1726882790.67236: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.67247: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.67345: variable 'profile_stat' from source: set_fact 28173 1726882790.67358: Evaluated conditional (profile_stat.stat.exists): False 28173 1726882790.67362: when evaluation is False, skipping this task 28173 1726882790.67366: _execute() done 28173 1726882790.67369: dumping result to json 28173 1726882790.67372: done dumping result, returning 28173 1726882790.67377: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-926c-8928-00000000087d] 28173 1726882790.67384: sending task result for task 0e448fcc-3ce9-926c-8928-00000000087d 28173 1726882790.67465: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000087d 28173 1726882790.67476: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28173 1726882790.67521: no more pending results, returning what we have 28173 1726882790.67524: results queue empty 28173 1726882790.67525: checking for any_errors_fatal 28173 1726882790.67534: done checking for any_errors_fatal 28173 1726882790.67535: checking for max_fail_percentage 28173 1726882790.67536: done checking for max_fail_percentage 28173 1726882790.67537: checking to see if all hosts have failed and the running result is not ok 28173 1726882790.67538: done checking to see if all hosts have failed 28173 1726882790.67539: getting the remaining hosts for this loop 28173 1726882790.67540: done getting the remaining hosts for this loop 28173 1726882790.67543: getting the next task for host managed_node2 28173 1726882790.67548: done getting next task for host managed_node2 28173 1726882790.67551: ^ task is: TASK: Get NM profile info 28173 1726882790.67555: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882790.67559: getting variables 28173 1726882790.67560: in VariableManager get_vars() 28173 1726882790.67588: Calling all_inventory to load vars for managed_node2 28173 1726882790.67591: Calling groups_inventory to load vars for managed_node2 28173 1726882790.67594: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882790.67604: Calling all_plugins_play to load vars for managed_node2 28173 1726882790.67607: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882790.67609: Calling groups_plugins_play to load vars for managed_node2 28173 1726882790.68417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882790.69743: done with get_vars() 28173 1726882790.69771: done getting variables 28173 1726882790.69863: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:39:50 -0400 (0:00:00.040) 0:00:43.863 ****** 28173 1726882790.69896: entering _queue_task() for managed_node2/shell 28173 1726882790.69898: Creating lock for shell 28173 1726882790.70144: worker is 1 (out of 1 available) 28173 1726882790.70157: exiting _queue_task() for managed_node2/shell 28173 1726882790.70172: done queuing things up, now waiting for results queue to drain 28173 1726882790.70173: waiting for pending results... 
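
The 'Get NM profile info' task (get_profile_stat.yml:25) uses the shell action, which, as the entries below show, is resolved through the command action plugin and shipped to the target as an AnsiballZ_command.py payload. The actual command is not visible in this part of the log, so the sketch below uses a placeholder command and an assumed register name:

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"  # placeholder command, not taken from this log
      register: nm_profile_exists  # assumed variable name
      ignore_errors: true  # assumed, since grep exits non-zero when no profile matches
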
28173 1726882790.70361: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28173 1726882790.70466: in run() - task 0e448fcc-3ce9-926c-8928-00000000087e 28173 1726882790.70495: variable 'ansible_search_path' from source: unknown 28173 1726882790.70501: variable 'ansible_search_path' from source: unknown 28173 1726882790.70540: calling self._execute() 28173 1726882790.70677: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.70693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.70706: variable 'omit' from source: magic vars 28173 1726882790.71097: variable 'ansible_distribution_major_version' from source: facts 28173 1726882790.71114: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882790.71125: variable 'omit' from source: magic vars 28173 1726882790.71172: variable 'omit' from source: magic vars 28173 1726882790.71277: variable 'profile' from source: include params 28173 1726882790.71306: variable 'interface' from source: set_fact 28173 1726882790.71430: variable 'interface' from source: set_fact 28173 1726882790.71453: variable 'omit' from source: magic vars 28173 1726882790.71532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882790.71556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882790.71579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882790.71606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.71628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882790.71651: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882790.71654: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.71657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.71731: Set connection var ansible_pipelining to False 28173 1726882790.71734: Set connection var ansible_shell_type to sh 28173 1726882790.71740: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882790.71747: Set connection var ansible_timeout to 10 28173 1726882790.71751: Set connection var ansible_shell_executable to /bin/sh 28173 1726882790.71756: Set connection var ansible_connection to ssh 28173 1726882790.71785: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.71788: variable 'ansible_connection' from source: unknown 28173 1726882790.71790: variable 'ansible_module_compression' from source: unknown 28173 1726882790.71792: variable 'ansible_shell_type' from source: unknown 28173 1726882790.71794: variable 'ansible_shell_executable' from source: unknown 28173 1726882790.71801: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882790.71804: variable 'ansible_pipelining' from source: unknown 28173 1726882790.71807: variable 'ansible_timeout' from source: unknown 28173 1726882790.71811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882790.71917: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882790.71927: variable 'omit' from source: magic vars 28173 1726882790.71931: starting attempt loop 28173 1726882790.71934: running the handler 28173 1726882790.71942: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882790.71956: _low_level_execute_command(): starting 28173 1726882790.71962: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882790.72460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.72482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.72497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.72508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.72553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.72568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.72688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.74346: stdout chunk (state=3): >>>/root <<< 28173 1726882790.74452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.74498: stderr chunk (state=3): >>><<< 28173 1726882790.74501: stdout chunk (state=3): >>><<< 28173 1726882790.74522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.74532: _low_level_execute_command(): starting 28173 1726882790.74538: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650 `" && echo ansible-tmp-1726882790.7452145-30114-137953152208650="` echo /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650 `" ) && sleep 0' 28173 1726882790.74970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.74981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.75010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.75013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.75016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.75068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.75072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.75188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.77053: stdout chunk (state=3): >>>ansible-tmp-1726882790.7452145-30114-137953152208650=/root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650 <<< 28173 1726882790.77164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.77215: stderr chunk (state=3): >>><<< 28173 1726882790.77218: stdout chunk (state=3): >>><<< 28173 1726882790.77233: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882790.7452145-30114-137953152208650=/root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.77258: variable 'ansible_module_compression' from source: unknown 28173 1726882790.77304: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882790.77335: variable 'ansible_facts' from source: unknown 28173 1726882790.77388: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/AnsiballZ_command.py 28173 1726882790.77491: Sending initial data 28173 1726882790.77502: Sent initial data (156 bytes) 28173 1726882790.78164: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.78170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.78202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882790.78206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.78208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.78263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.78271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.78273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.78374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.80094: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882790.80189: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882790.80292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpbr_6faxo 
/root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/AnsiballZ_command.py <<< 28173 1726882790.80388: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882790.81409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.81503: stderr chunk (state=3): >>><<< 28173 1726882790.81506: stdout chunk (state=3): >>><<< 28173 1726882790.81527: done transferring module to remote 28173 1726882790.81532: _low_level_execute_command(): starting 28173 1726882790.81537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/ /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/AnsiballZ_command.py && sleep 0' 28173 1726882790.81964: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.81968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.82000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.82003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.82005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.82059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882790.82062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.82171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.83923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882790.83986: stderr chunk (state=3): >>><<< 28173 1726882790.83989: stdout chunk (state=3): >>><<< 28173 1726882790.84080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882790.84084: _low_level_execute_command(): starting 28173 1726882790.84087: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/AnsiballZ_command.py && sleep 0' 28173 1726882790.84604: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882790.84607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882790.84647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.84650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882790.84652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882790.84703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882790.84706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882790.84813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882790.99604: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:39:50.976148", "end": "2024-09-20 21:39:50.994011", "delta": "0:00:00.017863", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882791.00716: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882791.00799: stderr chunk (state=3): >>><<< 28173 1726882791.00802: stdout chunk (state=3): >>><<< 28173 1726882791.00933: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:39:50.976148", "end": "2024-09-20 21:39:50.994011", "delta": "0:00:00.017863", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
28173 1726882791.00937: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882791.00940: _low_level_execute_command(): starting 28173 1726882791.00942: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882790.7452145-30114-137953152208650/ > /dev/null 2>&1 && sleep 0' 28173 1726882791.01484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882791.01497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.01510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.01528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.01573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.01586: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882791.01605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.01623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882791.01636: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882791.01647: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882791.01666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.01685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.01702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.01716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.01728: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882791.01743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.01820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882791.01841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882791.01857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.02006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.03786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.03830: stderr chunk (state=3): >>><<< 28173 1726882791.03833: stdout chunk (state=3): >>><<< 28173 1726882791.03846: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882791.03853: handler run complete 28173 1726882791.03877: Evaluated conditional (False): False 28173 1726882791.03885: attempt loop complete, returning result 28173 1726882791.03888: _execute() done 28173 1726882791.03890: dumping result to json 28173 1726882791.03895: done dumping result, returning 28173 1726882791.03902: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-926c-8928-00000000087e] 28173 1726882791.03908: sending task result for task 0e448fcc-3ce9-926c-8928-00000000087e 28173 1726882791.04007: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000087e 28173 1726882791.04009: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017863", "end": "2024-09-20 21:39:50.994011", "rc": 1, "start": "2024-09-20 21:39:50.976148" } MSG: non-zero return code ...ignoring 28173 1726882791.04105: no more pending results, returning what we have 28173 1726882791.04108: results queue empty 28173 1726882791.04109: checking for any_errors_fatal 28173 1726882791.04116: done checking for any_errors_fatal 28173 1726882791.04116: checking for max_fail_percentage 28173 1726882791.04118: done checking for max_fail_percentage 28173 1726882791.04119: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.04120: done checking to see if all hosts have failed 28173 1726882791.04120: getting the remaining hosts for this loop 28173 1726882791.04122: done getting the remaining hosts for this loop 28173 1726882791.04125: getting the next task for host managed_node2 28173 1726882791.04131: done getting next task for host managed_node2 28173 1726882791.04134: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28173 1726882791.04138: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882791.04141: getting variables 28173 1726882791.04143: in VariableManager get_vars() 28173 1726882791.04202: Calling all_inventory to load vars for managed_node2 28173 1726882791.04205: Calling groups_inventory to load vars for managed_node2 28173 1726882791.04209: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.04219: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.04222: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.04224: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.05658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.07191: done with get_vars() 28173 1726882791.07207: done getting variables 28173 1726882791.07250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:39:51 -0400 (0:00:00.373) 0:00:44.237 ****** 28173 1726882791.07279: entering _queue_task() for managed_node2/set_fact 28173 1726882791.07497: worker is 1 (out of 1 available) 28173 1726882791.07510: exiting _queue_task() for managed_node2/set_fact 28173 1726882791.07522: done queuing things up, now waiting for results queue to drain 28173 1726882791.07523: waiting for pending results... 
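For orientation, the two tasks traced around this point probably look something like the sketch below in get_profile_stat.yml. The nmcli command string, the non-zero return code that gets "...ignoring"-ed, the nm_profile_exists result, and the nm_profile_exists.rc == 0 condition are all visible in the log; the exact YAML layout, the use of {{ profile }} in the command (its rendered value here is ethtest0), and the flag names are assumptions.

# Hypothetical reconstruction of the tasks in get_profile_stat.yml around line 35;
# only the strings that appear in the surrounding log output are certain.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_exists   # rc=1 in this run, since no matching profile is found
  ignore_errors: true           # accounts for the "...ignoring" after the fatal result above

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true            # flag checked by the later assert
    lsr_net_profile_ansible_managed: true   # second flag implied by the task name; exact name assumed
  when: nm_profile_exists.rc == 0           # evaluated False in the trace that follows, so this is skipped

Because the grep pipeline matched nothing, rc is 1, the flag task is skipped, and lsr_net_profile_exists stays false, which is what the assert further down relies on.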
28173 1726882791.07695: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28173 1726882791.07771: in run() - task 0e448fcc-3ce9-926c-8928-00000000087f 28173 1726882791.07791: variable 'ansible_search_path' from source: unknown 28173 1726882791.07794: variable 'ansible_search_path' from source: unknown 28173 1726882791.07825: calling self._execute() 28173 1726882791.07917: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.07925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.07934: variable 'omit' from source: magic vars 28173 1726882791.08220: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.08231: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.08325: variable 'nm_profile_exists' from source: set_fact 28173 1726882791.08336: Evaluated conditional (nm_profile_exists.rc == 0): False 28173 1726882791.08339: when evaluation is False, skipping this task 28173 1726882791.08342: _execute() done 28173 1726882791.08346: dumping result to json 28173 1726882791.08349: done dumping result, returning 28173 1726882791.08352: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-926c-8928-00000000087f] 28173 1726882791.08359: sending task result for task 0e448fcc-3ce9-926c-8928-00000000087f 28173 1726882791.08444: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000087f 28173 1726882791.08448: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28173 1726882791.08496: no more pending results, returning what we have 28173 1726882791.08500: results queue empty 28173 1726882791.08501: checking for any_errors_fatal 28173 1726882791.08509: done checking for any_errors_fatal 28173 1726882791.08510: checking for max_fail_percentage 28173 1726882791.08511: done checking for max_fail_percentage 28173 1726882791.08512: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.08513: done checking to see if all hosts have failed 28173 1726882791.08513: getting the remaining hosts for this loop 28173 1726882791.08515: done getting the remaining hosts for this loop 28173 1726882791.08519: getting the next task for host managed_node2 28173 1726882791.08526: done getting next task for host managed_node2 28173 1726882791.08529: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28173 1726882791.08533: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28173 1726882791.08536: getting variables 28173 1726882791.08538: in VariableManager get_vars() 28173 1726882791.08562: Calling all_inventory to load vars for managed_node2 28173 1726882791.08565: Calling groups_inventory to load vars for managed_node2 28173 1726882791.08571: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.08582: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.08584: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.08587: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.09907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.10906: done with get_vars() 28173 1726882791.10921: done getting variables 28173 1726882791.10962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.11049: variable 'profile' from source: include params 28173 1726882791.11052: variable 'interface' from source: set_fact 28173 1726882791.11100: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:39:51 -0400 (0:00:00.038) 0:00:44.275 ****** 28173 1726882791.11125: entering _queue_task() for managed_node2/command 28173 1726882791.11322: worker is 1 (out of 1 available) 28173 1726882791.11335: exiting _queue_task() for managed_node2/command 28173 1726882791.11345: done queuing things up, now waiting for results queue to drain 28173 1726882791.11347: waiting for pending results... 
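The four ifcfg checks that follow (get_profile_stat.yml lines 49, 56, 62 and 69) are all skipped on the same false_condition, profile_stat.stat.exists. A minimal sketch of that shared pattern, with the caveat that the command body and register name are guesses because the skipped tasks never actually run here:

# Shared shape of the skipped ifcfg checks; the when: condition comes from the
# "false_condition" in the skip output, everything else is hypothetical.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  # path and grep target assumed; the log only shows the task name and the skip
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment
  when: profile_stat.stat.exists   # False for ethtest0, so each of these tasks is skipped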
28173 1726882791.11511: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 28173 1726882791.11587: in run() - task 0e448fcc-3ce9-926c-8928-000000000881 28173 1726882791.11597: variable 'ansible_search_path' from source: unknown 28173 1726882791.11600: variable 'ansible_search_path' from source: unknown 28173 1726882791.11627: calling self._execute() 28173 1726882791.11705: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.11708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.11717: variable 'omit' from source: magic vars 28173 1726882791.11971: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.11981: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.12065: variable 'profile_stat' from source: set_fact 28173 1726882791.12076: Evaluated conditional (profile_stat.stat.exists): False 28173 1726882791.12079: when evaluation is False, skipping this task 28173 1726882791.12082: _execute() done 28173 1726882791.12086: dumping result to json 28173 1726882791.12089: done dumping result, returning 28173 1726882791.12092: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-926c-8928-000000000881] 28173 1726882791.12098: sending task result for task 0e448fcc-3ce9-926c-8928-000000000881 28173 1726882791.12187: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000881 28173 1726882791.12190: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28173 1726882791.12261: no more pending results, returning what we have 28173 1726882791.12266: results queue empty 28173 1726882791.12267: checking for any_errors_fatal 28173 1726882791.12272: done checking for any_errors_fatal 28173 1726882791.12273: checking for max_fail_percentage 28173 1726882791.12274: done checking for max_fail_percentage 28173 1726882791.12275: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.12277: done checking to see if all hosts have failed 28173 1726882791.12277: getting the remaining hosts for this loop 28173 1726882791.12279: done getting the remaining hosts for this loop 28173 1726882791.12282: getting the next task for host managed_node2 28173 1726882791.12286: done getting next task for host managed_node2 28173 1726882791.12289: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28173 1726882791.12292: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.12296: getting variables 28173 1726882791.12297: in VariableManager get_vars() 28173 1726882791.12326: Calling all_inventory to load vars for managed_node2 28173 1726882791.12328: Calling groups_inventory to load vars for managed_node2 28173 1726882791.12331: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.12338: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.12339: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.12341: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.13234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.14186: done with get_vars() 28173 1726882791.14201: done getting variables 28173 1726882791.14241: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.14314: variable 'profile' from source: include params 28173 1726882791.14316: variable 'interface' from source: set_fact 28173 1726882791.14351: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:39:51 -0400 (0:00:00.032) 0:00:44.308 ****** 28173 1726882791.14378: entering _queue_task() for managed_node2/set_fact 28173 1726882791.14559: worker is 1 (out of 1 available) 28173 1726882791.14573: exiting _queue_task() for managed_node2/set_fact 28173 1726882791.14585: done queuing things up, now waiting for results queue to drain 28173 1726882791.14586: waiting for pending results... 
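profile_stat itself is not computed in this part of the trace; the .stat.exists attribute points at a stat result registered earlier in get_profile_stat.yml, presumably by something along these lines (task name and path are assumptions made only to explain why every ifcfg check skips):

# Hypothetical earlier task that would account for profile_stat.stat.exists being False
- name: Get stat of the profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed location
  register: profile_stat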
28173 1726882791.14751: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 28173 1726882791.14822: in run() - task 0e448fcc-3ce9-926c-8928-000000000882 28173 1726882791.14833: variable 'ansible_search_path' from source: unknown 28173 1726882791.14836: variable 'ansible_search_path' from source: unknown 28173 1726882791.14868: calling self._execute() 28173 1726882791.14947: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.14950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.14961: variable 'omit' from source: magic vars 28173 1726882791.15217: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.15227: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.15312: variable 'profile_stat' from source: set_fact 28173 1726882791.15322: Evaluated conditional (profile_stat.stat.exists): False 28173 1726882791.15325: when evaluation is False, skipping this task 28173 1726882791.15328: _execute() done 28173 1726882791.15330: dumping result to json 28173 1726882791.15333: done dumping result, returning 28173 1726882791.15338: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-926c-8928-000000000882] 28173 1726882791.15344: sending task result for task 0e448fcc-3ce9-926c-8928-000000000882 28173 1726882791.15428: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000882 28173 1726882791.15431: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28173 1726882791.15505: no more pending results, returning what we have 28173 1726882791.15508: results queue empty 28173 1726882791.15509: checking for any_errors_fatal 28173 1726882791.15514: done checking for any_errors_fatal 28173 1726882791.15515: checking for max_fail_percentage 28173 1726882791.15517: done checking for max_fail_percentage 28173 1726882791.15517: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.15518: done checking to see if all hosts have failed 28173 1726882791.15519: getting the remaining hosts for this loop 28173 1726882791.15520: done getting the remaining hosts for this loop 28173 1726882791.15523: getting the next task for host managed_node2 28173 1726882791.15528: done getting next task for host managed_node2 28173 1726882791.15530: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28173 1726882791.15534: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.15537: getting variables 28173 1726882791.15538: in VariableManager get_vars() 28173 1726882791.15559: Calling all_inventory to load vars for managed_node2 28173 1726882791.15560: Calling groups_inventory to load vars for managed_node2 28173 1726882791.15562: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.15576: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.15578: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.15580: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.16450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.17412: done with get_vars() 28173 1726882791.17427: done getting variables 28173 1726882791.17469: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.17544: variable 'profile' from source: include params 28173 1726882791.17547: variable 'interface' from source: set_fact 28173 1726882791.17587: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:39:51 -0400 (0:00:00.032) 0:00:44.340 ****** 28173 1726882791.17609: entering _queue_task() for managed_node2/command 28173 1726882791.17802: worker is 1 (out of 1 available) 28173 1726882791.17814: exiting _queue_task() for managed_node2/command 28173 1726882791.17827: done queuing things up, now waiting for results queue to drain 28173 1726882791.17828: waiting for pending results... 
28173 1726882791.18003: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 28173 1726882791.18080: in run() - task 0e448fcc-3ce9-926c-8928-000000000883 28173 1726882791.18091: variable 'ansible_search_path' from source: unknown 28173 1726882791.18094: variable 'ansible_search_path' from source: unknown 28173 1726882791.18121: calling self._execute() 28173 1726882791.18205: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.18208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.18217: variable 'omit' from source: magic vars 28173 1726882791.18485: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.18496: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.18580: variable 'profile_stat' from source: set_fact 28173 1726882791.18591: Evaluated conditional (profile_stat.stat.exists): False 28173 1726882791.18595: when evaluation is False, skipping this task 28173 1726882791.18598: _execute() done 28173 1726882791.18601: dumping result to json 28173 1726882791.18603: done dumping result, returning 28173 1726882791.18608: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-926c-8928-000000000883] 28173 1726882791.18613: sending task result for task 0e448fcc-3ce9-926c-8928-000000000883 28173 1726882791.18699: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000883 28173 1726882791.18701: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28173 1726882791.18774: no more pending results, returning what we have 28173 1726882791.18777: results queue empty 28173 1726882791.18778: checking for any_errors_fatal 28173 1726882791.18784: done checking for any_errors_fatal 28173 1726882791.18784: checking for max_fail_percentage 28173 1726882791.18786: done checking for max_fail_percentage 28173 1726882791.18787: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.18788: done checking to see if all hosts have failed 28173 1726882791.18788: getting the remaining hosts for this loop 28173 1726882791.18790: done getting the remaining hosts for this loop 28173 1726882791.18792: getting the next task for host managed_node2 28173 1726882791.18797: done getting next task for host managed_node2 28173 1726882791.18799: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28173 1726882791.18803: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.18806: getting variables 28173 1726882791.18807: in VariableManager get_vars() 28173 1726882791.18834: Calling all_inventory to load vars for managed_node2 28173 1726882791.18837: Calling groups_inventory to load vars for managed_node2 28173 1726882791.18840: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.18848: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.18851: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.18853: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.19648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.20624: done with get_vars() 28173 1726882791.20639: done getting variables 28173 1726882791.20688: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.20757: variable 'profile' from source: include params 28173 1726882791.20760: variable 'interface' from source: set_fact 28173 1726882791.20803: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:39:51 -0400 (0:00:00.032) 0:00:44.372 ****** 28173 1726882791.20824: entering _queue_task() for managed_node2/set_fact 28173 1726882791.21015: worker is 1 (out of 1 available) 28173 1726882791.21028: exiting _queue_task() for managed_node2/set_fact 28173 1726882791.21039: done queuing things up, now waiting for results queue to drain 28173 1726882791.21041: waiting for pending results... 
28173 1726882791.21208: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 28173 1726882791.21276: in run() - task 0e448fcc-3ce9-926c-8928-000000000884 28173 1726882791.21286: variable 'ansible_search_path' from source: unknown 28173 1726882791.21290: variable 'ansible_search_path' from source: unknown 28173 1726882791.21316: calling self._execute() 28173 1726882791.21393: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.21397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.21408: variable 'omit' from source: magic vars 28173 1726882791.21668: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.21677: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.21758: variable 'profile_stat' from source: set_fact 28173 1726882791.21772: Evaluated conditional (profile_stat.stat.exists): False 28173 1726882791.21775: when evaluation is False, skipping this task 28173 1726882791.21778: _execute() done 28173 1726882791.21781: dumping result to json 28173 1726882791.21783: done dumping result, returning 28173 1726882791.21786: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-926c-8928-000000000884] 28173 1726882791.21790: sending task result for task 0e448fcc-3ce9-926c-8928-000000000884 28173 1726882791.21880: done sending task result for task 0e448fcc-3ce9-926c-8928-000000000884 28173 1726882791.21883: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28173 1726882791.21944: no more pending results, returning what we have 28173 1726882791.21946: results queue empty 28173 1726882791.21947: checking for any_errors_fatal 28173 1726882791.21951: done checking for any_errors_fatal 28173 1726882791.21952: checking for max_fail_percentage 28173 1726882791.21953: done checking for max_fail_percentage 28173 1726882791.21954: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.21955: done checking to see if all hosts have failed 28173 1726882791.21955: getting the remaining hosts for this loop 28173 1726882791.21956: done getting the remaining hosts for this loop 28173 1726882791.21959: getting the next task for host managed_node2 28173 1726882791.21970: done getting next task for host managed_node2 28173 1726882791.21973: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28173 1726882791.21979: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.21984: getting variables 28173 1726882791.21985: in VariableManager get_vars() 28173 1726882791.22003: Calling all_inventory to load vars for managed_node2 28173 1726882791.22004: Calling groups_inventory to load vars for managed_node2 28173 1726882791.22006: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.22013: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.22015: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.22016: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.22883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.23863: done with get_vars() 28173 1726882791.23882: done getting variables 28173 1726882791.23923: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.23997: variable 'profile' from source: include params 28173 1726882791.24000: variable 'interface' from source: set_fact 28173 1726882791.24037: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:39:51 -0400 (0:00:00.032) 0:00:44.405 ****** 28173 1726882791.24058: entering _queue_task() for managed_node2/assert 28173 1726882791.24237: worker is 1 (out of 1 available) 28173 1726882791.24251: exiting _queue_task() for managed_node2/assert 28173 1726882791.24262: done queuing things up, now waiting for results queue to drain 28173 1726882791.24271: waiting for pending results... 
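The assert queued above is where the lsr_net_profile_exists flag finally gets checked; its condition (not lsr_net_profile_exists) and the passing result ("All assertions passed") appear in the trace that follows. A sketch, assuming the usual assert layout in assert_profile_absent.yml:

# Sketch of assert_profile_absent.yml around line 5; the condition is taken from the log,
# the surrounding layout is assumed.
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists   # evaluated True below, hence "All assertions passed"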
28173 1726882791.24424: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' 28173 1726882791.24507: in run() - task 0e448fcc-3ce9-926c-8928-00000000086d 28173 1726882791.24526: variable 'ansible_search_path' from source: unknown 28173 1726882791.24532: variable 'ansible_search_path' from source: unknown 28173 1726882791.24570: calling self._execute() 28173 1726882791.24709: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.24720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.24735: variable 'omit' from source: magic vars 28173 1726882791.25082: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.25099: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.25109: variable 'omit' from source: magic vars 28173 1726882791.25156: variable 'omit' from source: magic vars 28173 1726882791.25257: variable 'profile' from source: include params 28173 1726882791.25268: variable 'interface' from source: set_fact 28173 1726882791.25333: variable 'interface' from source: set_fact 28173 1726882791.25361: variable 'omit' from source: magic vars 28173 1726882791.25432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882791.25493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882791.25545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882791.25589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.25604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.25636: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882791.25645: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.25659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.25756: Set connection var ansible_pipelining to False 28173 1726882791.25759: Set connection var ansible_shell_type to sh 28173 1726882791.25770: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882791.25779: Set connection var ansible_timeout to 10 28173 1726882791.25789: Set connection var ansible_shell_executable to /bin/sh 28173 1726882791.25803: Set connection var ansible_connection to ssh 28173 1726882791.25833: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.25836: variable 'ansible_connection' from source: unknown 28173 1726882791.25839: variable 'ansible_module_compression' from source: unknown 28173 1726882791.25841: variable 'ansible_shell_type' from source: unknown 28173 1726882791.25846: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.25852: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.25855: variable 'ansible_pipelining' from source: unknown 28173 1726882791.25858: variable 'ansible_timeout' from source: unknown 28173 1726882791.25862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.25970: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882791.25982: variable 'omit' from source: magic vars 28173 1726882791.25985: starting attempt loop 28173 1726882791.25988: running the handler 28173 1726882791.26071: variable 'lsr_net_profile_exists' from source: set_fact 28173 1726882791.26080: Evaluated conditional (not lsr_net_profile_exists): True 28173 1726882791.26086: handler run complete 28173 1726882791.26097: attempt loop complete, returning result 28173 1726882791.26100: _execute() done 28173 1726882791.26102: dumping result to json 28173 1726882791.26106: done dumping result, returning 28173 1726882791.26111: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' [0e448fcc-3ce9-926c-8928-00000000086d] 28173 1726882791.26119: sending task result for task 0e448fcc-3ce9-926c-8928-00000000086d 28173 1726882791.26200: done sending task result for task 0e448fcc-3ce9-926c-8928-00000000086d 28173 1726882791.26202: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882791.26247: no more pending results, returning what we have 28173 1726882791.26250: results queue empty 28173 1726882791.26251: checking for any_errors_fatal 28173 1726882791.26257: done checking for any_errors_fatal 28173 1726882791.26258: checking for max_fail_percentage 28173 1726882791.26259: done checking for max_fail_percentage 28173 1726882791.26261: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.26262: done checking to see if all hosts have failed 28173 1726882791.26262: getting the remaining hosts for this loop 28173 1726882791.26265: done getting the remaining hosts for this loop 28173 1726882791.26269: getting the next task for host managed_node2 28173 1726882791.26275: done getting next task for host managed_node2 28173 1726882791.26279: ^ task is: TASK: Include the task 'assert_device_absent.yml' 28173 1726882791.26280: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.26283: getting variables 28173 1726882791.26285: in VariableManager get_vars() 28173 1726882791.26309: Calling all_inventory to load vars for managed_node2 28173 1726882791.26312: Calling groups_inventory to load vars for managed_node2 28173 1726882791.26315: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.26324: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.26327: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.26329: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.27208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.28768: done with get_vars() 28173 1726882791.28791: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:156 Friday 20 September 2024 21:39:51 -0400 (0:00:00.048) 0:00:44.453 ****** 28173 1726882791.28883: entering _queue_task() for managed_node2/include_tasks 28173 1726882791.29152: worker is 1 (out of 1 available) 28173 1726882791.29167: exiting _queue_task() for managed_node2/include_tasks 28173 1726882791.29179: done queuing things up, now waiting for results queue to drain 28173 1726882791.29180: waiting for pending results... 28173 1726882791.29460: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 28173 1726882791.29565: in run() - task 0e448fcc-3ce9-926c-8928-0000000000f0 28173 1726882791.29586: variable 'ansible_search_path' from source: unknown 28173 1726882791.29630: calling self._execute() 28173 1726882791.29735: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.29748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.29766: variable 'omit' from source: magic vars 28173 1726882791.30139: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.30160: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.30177: _execute() done 28173 1726882791.30186: dumping result to json 28173 1726882791.30193: done dumping result, returning 28173 1726882791.30206: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-926c-8928-0000000000f0] 28173 1726882791.30211: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f0 28173 1726882791.30309: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f0 28173 1726882791.30313: WORKER PROCESS EXITING 28173 1726882791.30343: no more pending results, returning what we have 28173 1726882791.30348: in VariableManager get_vars() 28173 1726882791.30384: Calling all_inventory to load vars for managed_node2 28173 1726882791.30386: Calling groups_inventory to load vars for managed_node2 28173 1726882791.30390: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.30402: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.30404: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.30407: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.31223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.32412: done with get_vars() 28173 
1726882791.32430: variable 'ansible_search_path' from source: unknown 28173 1726882791.32442: we have included files to process 28173 1726882791.32443: generating all_blocks data 28173 1726882791.32445: done generating all_blocks data 28173 1726882791.32451: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28173 1726882791.32452: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28173 1726882791.32454: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28173 1726882791.32603: in VariableManager get_vars() 28173 1726882791.32617: done with get_vars() 28173 1726882791.32722: done processing included file 28173 1726882791.32724: iterating over new_blocks loaded from include file 28173 1726882791.32726: in VariableManager get_vars() 28173 1726882791.32737: done with get_vars() 28173 1726882791.32738: filtering new block on tags 28173 1726882791.32755: done filtering new block on tags 28173 1726882791.32758: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 28173 1726882791.32763: extending task lists for all hosts with included blocks 28173 1726882791.32930: done extending task lists 28173 1726882791.32931: done processing included files 28173 1726882791.32932: results queue empty 28173 1726882791.32932: checking for any_errors_fatal 28173 1726882791.32935: done checking for any_errors_fatal 28173 1726882791.32936: checking for max_fail_percentage 28173 1726882791.32937: done checking for max_fail_percentage 28173 1726882791.32938: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.32939: done checking to see if all hosts have failed 28173 1726882791.32939: getting the remaining hosts for this loop 28173 1726882791.32941: done getting the remaining hosts for this loop 28173 1726882791.32943: getting the next task for host managed_node2 28173 1726882791.32948: done getting next task for host managed_node2 28173 1726882791.32950: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28173 1726882791.32952: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.32955: getting variables 28173 1726882791.32955: in VariableManager get_vars() 28173 1726882791.32965: Calling all_inventory to load vars for managed_node2 28173 1726882791.32968: Calling groups_inventory to load vars for managed_node2 28173 1726882791.32970: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.32974: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.32977: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.32980: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.33842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.34772: done with get_vars() 28173 1726882791.34786: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:39:51 -0400 (0:00:00.059) 0:00:44.512 ****** 28173 1726882791.34836: entering _queue_task() for managed_node2/include_tasks 28173 1726882791.35042: worker is 1 (out of 1 available) 28173 1726882791.35055: exiting _queue_task() for managed_node2/include_tasks 28173 1726882791.35069: done queuing things up, now waiting for results queue to drain 28173 1726882791.35071: waiting for pending results... 28173 1726882791.35286: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28173 1726882791.35460: in run() - task 0e448fcc-3ce9-926c-8928-0000000008b5 28173 1726882791.35494: variable 'ansible_search_path' from source: unknown 28173 1726882791.35508: variable 'ansible_search_path' from source: unknown 28173 1726882791.35553: calling self._execute() 28173 1726882791.35680: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.35692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.35706: variable 'omit' from source: magic vars 28173 1726882791.36115: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.36135: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.36147: _execute() done 28173 1726882791.36154: dumping result to json 28173 1726882791.36161: done dumping result, returning 28173 1726882791.36184: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-926c-8928-0000000008b5] 28173 1726882791.36196: sending task result for task 0e448fcc-3ce9-926c-8928-0000000008b5 28173 1726882791.36306: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000008b5 28173 1726882791.36314: WORKER PROCESS EXITING 28173 1726882791.36345: no more pending results, returning what we have 28173 1726882791.36351: in VariableManager get_vars() 28173 1726882791.36389: Calling all_inventory to load vars for managed_node2 28173 1726882791.36392: Calling groups_inventory to load vars for managed_node2 28173 1726882791.36396: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.36411: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.36414: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.36417: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.37693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 28173 1726882791.38646: done with get_vars() 28173 1726882791.38672: variable 'ansible_search_path' from source: unknown 28173 1726882791.38673: variable 'ansible_search_path' from source: unknown 28173 1726882791.38702: we have included files to process 28173 1726882791.38703: generating all_blocks data 28173 1726882791.38704: done generating all_blocks data 28173 1726882791.38705: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882791.38705: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882791.38707: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28173 1726882791.38853: done processing included file 28173 1726882791.38855: iterating over new_blocks loaded from include file 28173 1726882791.38856: in VariableManager get_vars() 28173 1726882791.38872: done with get_vars() 28173 1726882791.38873: filtering new block on tags 28173 1726882791.38887: done filtering new block on tags 28173 1726882791.38889: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28173 1726882791.38892: extending task lists for all hosts with included blocks 28173 1726882791.38969: done extending task lists 28173 1726882791.38970: done processing included files 28173 1726882791.38971: results queue empty 28173 1726882791.38972: checking for any_errors_fatal 28173 1726882791.38974: done checking for any_errors_fatal 28173 1726882791.38975: checking for max_fail_percentage 28173 1726882791.38976: done checking for max_fail_percentage 28173 1726882791.38976: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.38977: done checking to see if all hosts have failed 28173 1726882791.38978: getting the remaining hosts for this loop 28173 1726882791.38979: done getting the remaining hosts for this loop 28173 1726882791.38980: getting the next task for host managed_node2 28173 1726882791.38983: done getting next task for host managed_node2 28173 1726882791.38985: ^ task is: TASK: Get stat for interface {{ interface }} 28173 1726882791.38987: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.38988: getting variables 28173 1726882791.38989: in VariableManager get_vars() 28173 1726882791.38995: Calling all_inventory to load vars for managed_node2 28173 1726882791.38997: Calling groups_inventory to load vars for managed_node2 28173 1726882791.38998: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.39002: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.39003: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.39005: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.40117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.42049: done with get_vars() 28173 1726882791.42073: done getting variables 28173 1726882791.42228: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:51 -0400 (0:00:00.074) 0:00:44.586 ****** 28173 1726882791.42259: entering _queue_task() for managed_node2/stat 28173 1726882791.42557: worker is 1 (out of 1 available) 28173 1726882791.42578: exiting _queue_task() for managed_node2/stat 28173 1726882791.42617: done queuing things up, now waiting for results queue to drain 28173 1726882791.42619: waiting for pending results... 28173 1726882791.42782: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 28173 1726882791.42859: in run() - task 0e448fcc-3ce9-926c-8928-0000000008cf 28173 1726882791.42874: variable 'ansible_search_path' from source: unknown 28173 1726882791.42878: variable 'ansible_search_path' from source: unknown 28173 1726882791.42906: calling self._execute() 28173 1726882791.42981: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.42985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.42995: variable 'omit' from source: magic vars 28173 1726882791.43263: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.43276: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.43281: variable 'omit' from source: magic vars 28173 1726882791.43312: variable 'omit' from source: magic vars 28173 1726882791.43423: variable 'interface' from source: set_fact 28173 1726882791.43445: variable 'omit' from source: magic vars 28173 1726882791.43500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882791.43539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882791.43558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882791.43593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.43610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.43647: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882791.43651: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.43653: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28173 1726882791.43762: Set connection var ansible_pipelining to False 28173 1726882791.43777: Set connection var ansible_shell_type to sh 28173 1726882791.43797: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882791.43808: Set connection var ansible_timeout to 10 28173 1726882791.43817: Set connection var ansible_shell_executable to /bin/sh 28173 1726882791.43825: Set connection var ansible_connection to ssh 28173 1726882791.43850: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.43857: variable 'ansible_connection' from source: unknown 28173 1726882791.43865: variable 'ansible_module_compression' from source: unknown 28173 1726882791.43874: variable 'ansible_shell_type' from source: unknown 28173 1726882791.43882: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.43889: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.43899: variable 'ansible_pipelining' from source: unknown 28173 1726882791.43910: variable 'ansible_timeout' from source: unknown 28173 1726882791.43917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.44131: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28173 1726882791.44147: variable 'omit' from source: magic vars 28173 1726882791.44157: starting attempt loop 28173 1726882791.44169: running the handler 28173 1726882791.44189: _low_level_execute_command(): starting 28173 1726882791.44203: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882791.45089: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.45115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.45120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.45123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.45286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.45560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.46968: stdout chunk (state=3): >>>/root <<< 28173 1726882791.47071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.47137: stderr chunk (state=3): >>><<< 28173 1726882791.47140: stdout chunk (state=3): >>><<< 28173 1726882791.47251: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882791.47255: _low_level_execute_command(): starting 28173 1726882791.47258: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167 `" && echo ansible-tmp-1726882791.4715774-30153-6615758310167="` echo /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167 `" ) && sleep 0' 28173 1726882791.47848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882791.47862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.47884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.47918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.47980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.47995: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882791.48025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.48045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882791.48058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882791.48076: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882791.48090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.48104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.48124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.48142: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.48154: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882791.48175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.48268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882791.48297: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 28173 1726882791.48314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.48452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.50425: stdout chunk (state=3): >>>ansible-tmp-1726882791.4715774-30153-6615758310167=/root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167 <<< 28173 1726882791.50619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.50622: stdout chunk (state=3): >>><<< 28173 1726882791.50624: stderr chunk (state=3): >>><<< 28173 1726882791.50681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882791.4715774-30153-6615758310167=/root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882791.50755: variable 'ansible_module_compression' from source: unknown 28173 1726882791.50774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28173 1726882791.50804: variable 'ansible_facts' from source: unknown 28173 1726882791.50883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/AnsiballZ_stat.py 28173 1726882791.50983: Sending initial data 28173 1726882791.50992: Sent initial data (151 bytes) 28173 1726882791.51641: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882791.51644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.51647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.51684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.51687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.51689: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.51736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882791.51745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.51858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.53613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882791.53707: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882791.53806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpp0koa1bi /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/AnsiballZ_stat.py <<< 28173 1726882791.53901: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882791.54910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.55014: stderr chunk (state=3): >>><<< 28173 1726882791.55017: stdout chunk (state=3): >>><<< 28173 1726882791.55032: done transferring module to remote 28173 1726882791.55041: _low_level_execute_command(): starting 28173 1726882791.55045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/ /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/AnsiballZ_stat.py && sleep 0' 28173 1726882791.55680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882791.55695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.55711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.55729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.55791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.55805: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882791.55820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.55838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882791.55852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882791.55880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882791.55894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.55909: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.55925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.55937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.55947: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882791.55959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.56041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882791.56060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882791.56087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.56214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.57972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.58017: stderr chunk (state=3): >>><<< 28173 1726882791.58020: stdout chunk (state=3): >>><<< 28173 1726882791.58035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882791.58039: _low_level_execute_command(): starting 28173 1726882791.58047: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/AnsiballZ_stat.py && sleep 0' 28173 1726882791.58517: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.58549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.58552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.58554: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.58629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882791.58632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.58743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.71941: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28173 1726882791.72920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882791.72974: stderr chunk (state=3): >>><<< 28173 1726882791.72978: stdout chunk (state=3): >>><<< 28173 1726882791.72990: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
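
The stat result above records the exact module arguments used for this check: path /sys/class/net/ethtest0, with attribute, checksum and MIME collection disabled, and the result is later read back as interface_stat. A minimal reconstruction of what tasks/get_interface_stat.yml plausibly contains, inferred only from those logged arguments and variable lookups (the file actually shipped in the collection may differ):

# Hypothetical sketch of tasks/get_interface_stat.yml, reconstructed from the
# module_args shown in the result above; the register name is assumed from the
# later 'interface_stat' lookups in this log.
- name: "Get stat for interface {{ interface }}"
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

Checking /sys/class/net/<name> is a cheap existence test: the kernel exposes a directory there for every network device, so stat.exists being false means the interface is gone.
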
28173 1726882791.73012: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882791.73020: _low_level_execute_command(): starting 28173 1726882791.73025: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882791.4715774-30153-6615758310167/ > /dev/null 2>&1 && sleep 0' 28173 1726882791.73549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.73552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.73597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.73601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.73607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.73614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.73662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882791.73691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.73820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882791.75626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882791.75669: stderr chunk (state=3): >>><<< 28173 1726882791.75675: stdout chunk (state=3): >>><<< 28173 1726882791.75690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882791.75697: handler run complete 28173 1726882791.75714: attempt loop complete, returning result 28173 1726882791.75716: _execute() done 28173 1726882791.75719: dumping result to json 28173 1726882791.75721: done dumping result, returning 28173 1726882791.75728: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-926c-8928-0000000008cf] 28173 1726882791.75734: sending task result for task 0e448fcc-3ce9-926c-8928-0000000008cf 28173 1726882791.75827: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000008cf 28173 1726882791.75829: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28173 1726882791.75912: no more pending results, returning what we have 28173 1726882791.75916: results queue empty 28173 1726882791.75917: checking for any_errors_fatal 28173 1726882791.75918: done checking for any_errors_fatal 28173 1726882791.75919: checking for max_fail_percentage 28173 1726882791.75921: done checking for max_fail_percentage 28173 1726882791.75922: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.75923: done checking to see if all hosts have failed 28173 1726882791.75924: getting the remaining hosts for this loop 28173 1726882791.75926: done getting the remaining hosts for this loop 28173 1726882791.75929: getting the next task for host managed_node2 28173 1726882791.75936: done getting next task for host managed_node2 28173 1726882791.75941: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28173 1726882791.75944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.75947: getting variables 28173 1726882791.75948: in VariableManager get_vars() 28173 1726882791.75978: Calling all_inventory to load vars for managed_node2 28173 1726882791.75980: Calling groups_inventory to load vars for managed_node2 28173 1726882791.75984: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.75994: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.75996: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.75998: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.81243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.82216: done with get_vars() 28173 1726882791.82232: done getting variables 28173 1726882791.82267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28173 1726882791.82338: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:39:51 -0400 (0:00:00.400) 0:00:44.987 ****** 28173 1726882791.82356: entering _queue_task() for managed_node2/assert 28173 1726882791.82594: worker is 1 (out of 1 available) 28173 1726882791.82608: exiting _queue_task() for managed_node2/assert 28173 1726882791.82618: done queuing things up, now waiting for results queue to drain 28173 1726882791.82620: waiting for pending results... 
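
The next task is the assert at assert_device_absent.yml:5, and the handler below evaluates the conditional (not interface_stat.stat.exists). Together with the include at line 3 of the same file, that suggests a task file along these lines; this is a hedged sketch derived from the log, not the collection's verbatim source:

# Plausible shape of tasks/assert_device_absent.yml, inferred from the task
# names, the include at :3 and the assert at :5 referenced in this log.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: "Assert that the interface is absent - '{{ interface }}'"
  assert:
    that:
      - not interface_stat.stat.exists   # the conditional this run reports as True
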
28173 1726882791.82801: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' 28173 1726882791.82884: in run() - task 0e448fcc-3ce9-926c-8928-0000000008b6 28173 1726882791.82894: variable 'ansible_search_path' from source: unknown 28173 1726882791.82897: variable 'ansible_search_path' from source: unknown 28173 1726882791.82927: calling self._execute() 28173 1726882791.83008: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.83011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.83020: variable 'omit' from source: magic vars 28173 1726882791.83298: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.83309: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.83314: variable 'omit' from source: magic vars 28173 1726882791.83343: variable 'omit' from source: magic vars 28173 1726882791.83414: variable 'interface' from source: set_fact 28173 1726882791.83429: variable 'omit' from source: magic vars 28173 1726882791.83461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882791.83490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882791.83507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882791.83525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.83535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.83558: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882791.83561: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.83566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.83638: Set connection var ansible_pipelining to False 28173 1726882791.83642: Set connection var ansible_shell_type to sh 28173 1726882791.83648: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882791.83654: Set connection var ansible_timeout to 10 28173 1726882791.83660: Set connection var ansible_shell_executable to /bin/sh 28173 1726882791.83666: Set connection var ansible_connection to ssh 28173 1726882791.83687: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.83690: variable 'ansible_connection' from source: unknown 28173 1726882791.83693: variable 'ansible_module_compression' from source: unknown 28173 1726882791.83695: variable 'ansible_shell_type' from source: unknown 28173 1726882791.83698: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.83700: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.83703: variable 'ansible_pipelining' from source: unknown 28173 1726882791.83706: variable 'ansible_timeout' from source: unknown 28173 1726882791.83709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.83809: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 28173 1726882791.83820: variable 'omit' from source: magic vars 28173 1726882791.83823: starting attempt loop 28173 1726882791.83825: running the handler 28173 1726882791.83927: variable 'interface_stat' from source: set_fact 28173 1726882791.83936: Evaluated conditional (not interface_stat.stat.exists): True 28173 1726882791.83940: handler run complete 28173 1726882791.83957: attempt loop complete, returning result 28173 1726882791.83960: _execute() done 28173 1726882791.83963: dumping result to json 28173 1726882791.83969: done dumping result, returning 28173 1726882791.83972: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' [0e448fcc-3ce9-926c-8928-0000000008b6] 28173 1726882791.83978: sending task result for task 0e448fcc-3ce9-926c-8928-0000000008b6 28173 1726882791.84057: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000008b6 28173 1726882791.84063: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28173 1726882791.84110: no more pending results, returning what we have 28173 1726882791.84113: results queue empty 28173 1726882791.84114: checking for any_errors_fatal 28173 1726882791.84123: done checking for any_errors_fatal 28173 1726882791.84124: checking for max_fail_percentage 28173 1726882791.84125: done checking for max_fail_percentage 28173 1726882791.84126: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.84127: done checking to see if all hosts have failed 28173 1726882791.84128: getting the remaining hosts for this loop 28173 1726882791.84129: done getting the remaining hosts for this loop 28173 1726882791.84133: getting the next task for host managed_node2 28173 1726882791.84140: done getting next task for host managed_node2 28173 1726882791.84142: ^ task is: TASK: Verify network state restored to default 28173 1726882791.84144: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.84149: getting variables 28173 1726882791.84150: in VariableManager get_vars() 28173 1726882791.84184: Calling all_inventory to load vars for managed_node2 28173 1726882791.84187: Calling groups_inventory to load vars for managed_node2 28173 1726882791.84190: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.84200: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.84202: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.84205: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.87121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.89290: done with get_vars() 28173 1726882791.89320: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:158 Friday 20 September 2024 21:39:51 -0400 (0:00:00.070) 0:00:45.058 ****** 28173 1726882791.89411: entering _queue_task() for managed_node2/include_tasks 28173 1726882791.89777: worker is 1 (out of 1 available) 28173 1726882791.89829: exiting _queue_task() for managed_node2/include_tasks 28173 1726882791.89840: done queuing things up, now waiting for results queue to drain 28173 1726882791.89841: waiting for pending results... 28173 1726882791.90065: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 28173 1726882791.90140: in run() - task 0e448fcc-3ce9-926c-8928-0000000000f1 28173 1726882791.90152: variable 'ansible_search_path' from source: unknown 28173 1726882791.90185: calling self._execute() 28173 1726882791.90273: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.90279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.90289: variable 'omit' from source: magic vars 28173 1726882791.90565: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.90578: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.90584: _execute() done 28173 1726882791.90587: dumping result to json 28173 1726882791.90590: done dumping result, returning 28173 1726882791.90595: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0e448fcc-3ce9-926c-8928-0000000000f1] 28173 1726882791.90602: sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f1 28173 1726882791.90688: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000000f1 28173 1726882791.90690: WORKER PROCESS EXITING 28173 1726882791.90749: no more pending results, returning what we have 28173 1726882791.90754: in VariableManager get_vars() 28173 1726882791.90785: Calling all_inventory to load vars for managed_node2 28173 1726882791.90787: Calling groups_inventory to load vars for managed_node2 28173 1726882791.90790: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.90807: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.90810: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.90814: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.91608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.93507: done with get_vars() 28173 
1726882791.93521: variable 'ansible_search_path' from source: unknown 28173 1726882791.93531: we have included files to process 28173 1726882791.93531: generating all_blocks data 28173 1726882791.93533: done generating all_blocks data 28173 1726882791.93538: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28173 1726882791.93538: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28173 1726882791.93540: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28173 1726882791.93828: done processing included file 28173 1726882791.93830: iterating over new_blocks loaded from include file 28173 1726882791.93831: in VariableManager get_vars() 28173 1726882791.93839: done with get_vars() 28173 1726882791.93840: filtering new block on tags 28173 1726882791.93851: done filtering new block on tags 28173 1726882791.93853: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 28173 1726882791.93856: extending task lists for all hosts with included blocks 28173 1726882791.94012: done extending task lists 28173 1726882791.94013: done processing included files 28173 1726882791.94014: results queue empty 28173 1726882791.94014: checking for any_errors_fatal 28173 1726882791.94016: done checking for any_errors_fatal 28173 1726882791.94017: checking for max_fail_percentage 28173 1726882791.94018: done checking for max_fail_percentage 28173 1726882791.94018: checking to see if all hosts have failed and the running result is not ok 28173 1726882791.94019: done checking to see if all hosts have failed 28173 1726882791.94019: getting the remaining hosts for this loop 28173 1726882791.94020: done getting the remaining hosts for this loop 28173 1726882791.94022: getting the next task for host managed_node2 28173 1726882791.94024: done getting next task for host managed_node2 28173 1726882791.94025: ^ task is: TASK: Check routes and DNS 28173 1726882791.94027: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882791.94028: getting variables 28173 1726882791.94029: in VariableManager get_vars() 28173 1726882791.94034: Calling all_inventory to load vars for managed_node2 28173 1726882791.94036: Calling groups_inventory to load vars for managed_node2 28173 1726882791.94038: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882791.94042: Calling all_plugins_play to load vars for managed_node2 28173 1726882791.94044: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882791.94045: Calling groups_plugins_play to load vars for managed_node2 28173 1726882791.94968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882791.96767: done with get_vars() 28173 1726882791.96793: done getting variables 28173 1726882791.96836: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:39:51 -0400 (0:00:00.074) 0:00:45.133 ****** 28173 1726882791.96860: entering _queue_task() for managed_node2/shell 28173 1726882791.97151: worker is 1 (out of 1 available) 28173 1726882791.97166: exiting _queue_task() for managed_node2/shell 28173 1726882791.97178: done queuing things up, now waiting for results queue to drain 28173 1726882791.97179: waiting for pending results... 28173 1726882791.97480: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 28173 1726882791.97601: in run() - task 0e448fcc-3ce9-926c-8928-0000000008e7 28173 1726882791.97624: variable 'ansible_search_path' from source: unknown 28173 1726882791.97631: variable 'ansible_search_path' from source: unknown 28173 1726882791.97679: calling self._execute() 28173 1726882791.97787: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.97800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.97815: variable 'omit' from source: magic vars 28173 1726882791.98218: variable 'ansible_distribution_major_version' from source: facts 28173 1726882791.98246: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882791.98262: variable 'omit' from source: magic vars 28173 1726882791.98302: variable 'omit' from source: magic vars 28173 1726882791.98327: variable 'omit' from source: magic vars 28173 1726882791.98361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882791.98410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882791.98429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882791.98442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.98451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882791.98480: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
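For readability, here is what the two shell payloads pushed by check_network_dns.yml actually run. Both tasks (this one at check_network_dns.yml:6 and "Verify DNS and network connectivity" at check_network_dns.yml:24, executed further below) go through the ansible.legacy.command module with _uses_shell=true; the scripts below are reconstructed from the escaped "_raw_params" strings recorded in the module invocations later in this log, with indentation normalized and "#" comments added here for orientation (the comments are not part of the recorded payloads).

Check routes and DNS (check_network_dns.yml:6):

    # Print addresses, IPv4/IPv6 routes, and the resolver configuration,
    # each prefixed with a marker line so the captured stdout is easy to scan.
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi

Verify DNS and network connectivity (check_network_dns.yml:24):

    # Fail the task if either mirror host cannot be resolved or reached over HTTPS.
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
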
28173 1726882791.98485: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.98488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.98561: Set connection var ansible_pipelining to False 28173 1726882791.98567: Set connection var ansible_shell_type to sh 28173 1726882791.98575: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882791.98583: Set connection var ansible_timeout to 10 28173 1726882791.98586: Set connection var ansible_shell_executable to /bin/sh 28173 1726882791.98591: Set connection var ansible_connection to ssh 28173 1726882791.98608: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.98611: variable 'ansible_connection' from source: unknown 28173 1726882791.98614: variable 'ansible_module_compression' from source: unknown 28173 1726882791.98616: variable 'ansible_shell_type' from source: unknown 28173 1726882791.98619: variable 'ansible_shell_executable' from source: unknown 28173 1726882791.98621: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882791.98623: variable 'ansible_pipelining' from source: unknown 28173 1726882791.98626: variable 'ansible_timeout' from source: unknown 28173 1726882791.98630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882791.99572: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882791.99576: variable 'omit' from source: magic vars 28173 1726882791.99579: starting attempt loop 28173 1726882791.99581: running the handler 28173 1726882791.99584: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882791.99587: _low_level_execute_command(): starting 28173 1726882791.99589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882791.99591: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882791.99594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882791.99597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.99599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.99602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.99605: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882791.99607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.99609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882791.99611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882791.99614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882791.99616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 
1726882791.99618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882791.99621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882791.99623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882791.99683: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882791.99686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882791.99771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882791.99774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882791.99777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882791.99854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.01504: stdout chunk (state=3): >>>/root <<< 28173 1726882792.01655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.01683: stderr chunk (state=3): >>><<< 28173 1726882792.01687: stdout chunk (state=3): >>><<< 28173 1726882792.01791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.01794: _low_level_execute_command(): starting 28173 1726882792.01804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861 `" && echo ansible-tmp-1726882792.0170567-30192-243964452115861="` echo /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861 `" ) && sleep 0' 28173 1726882792.02334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882792.02350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.02368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.02389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.02429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.02442: stderr chunk (state=3): >>>debug2: 
match not found <<< 28173 1726882792.02456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.02478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882792.02490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882792.02501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882792.02512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.02526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.02541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.02551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.02559: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882792.02574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.02641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.02657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.02673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.02896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.04775: stdout chunk (state=3): >>>ansible-tmp-1726882792.0170567-30192-243964452115861=/root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861 <<< 28173 1726882792.04962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.04970: stdout chunk (state=3): >>><<< 28173 1726882792.04973: stderr chunk (state=3): >>><<< 28173 1726882792.05287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882792.0170567-30192-243964452115861=/root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.05291: variable 'ansible_module_compression' from source: unknown 28173 1726882792.05294: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882792.05296: 
variable 'ansible_facts' from source: unknown 28173 1726882792.05299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/AnsiballZ_command.py 28173 1726882792.05367: Sending initial data 28173 1726882792.05370: Sent initial data (156 bytes) 28173 1726882792.06356: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882792.06372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.06388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.06405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.06446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.06457: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882792.06472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.06490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882792.06503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882792.06515: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882792.06528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.06542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.06557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.06572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.06585: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882792.06600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.06680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.06697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.06712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.06981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.08719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882792.08821: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882792.08917: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpr1gbaf69 /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/AnsiballZ_command.py <<< 28173 1726882792.09016: stderr chunk 
(state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882792.10347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.10470: stderr chunk (state=3): >>><<< 28173 1726882792.10473: stdout chunk (state=3): >>><<< 28173 1726882792.10475: done transferring module to remote 28173 1726882792.10478: _low_level_execute_command(): starting 28173 1726882792.10481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/ /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/AnsiballZ_command.py && sleep 0' 28173 1726882792.11094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882792.11108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.11123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.11141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.11186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.11200: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882792.11214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.11233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882792.11246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882792.11258: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882792.11274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.11289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.11305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.11318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.11330: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882792.11345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.11422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.11444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.11462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.11681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.13410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.13414: stdout chunk (state=3): >>><<< 28173 1726882792.13416: stderr chunk (state=3): >>><<< 28173 1726882792.13470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.13474: _low_level_execute_command(): starting 28173 1726882792.13477: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/AnsiballZ_command.py && sleep 0' 28173 1726882792.14037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28173 1726882792.14040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.14092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882792.14095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28173 1726882792.14097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.14100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.14158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.14181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.14188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.14302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.28187: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3072sec preferred_lft 3072sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global 
noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:52.271124", "end": "2024-09-20 21:39:52.279644", "delta": "0:00:00.008520", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882792.29499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 28173 1726882792.29503: stdout chunk (state=3): >>><<< 28173 1726882792.29505: stderr chunk (state=3): >>><<< 28173 1726882792.29659: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3072sec preferred_lft 3072sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:52.271124", "end": "2024-09-20 21:39:52.279644", "delta": "0:00:00.008520", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882792.29666: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882792.29670: _low_level_execute_command(): starting 28173 1726882792.29672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882792.0170567-30192-243964452115861/ > /dev/null 2>&1 && sleep 0' 28173 1726882792.30126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.30129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.30178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.30185: stderr chunk (state=3): >>>debug2: match not found <<< 28173 1726882792.30195: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.30207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28173 1726882792.30214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 28173 1726882792.30221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28173 1726882792.30228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.30236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.30361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.30366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 28173 1726882792.30368: stderr chunk (state=3): >>>debug2: match found <<< 28173 1726882792.30370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.30571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.30575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.30578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.30645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.32480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.32485: stderr chunk (state=3): >>><<< 28173 1726882792.32487: stdout chunk (state=3): >>><<< 28173 1726882792.32502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.32510: handler run complete 28173 1726882792.32531: Evaluated conditional (False): False 28173 1726882792.32536: attempt loop complete, returning result 28173 1726882792.32539: _execute() done 28173 1726882792.32542: dumping result to json 28173 1726882792.32547: done dumping result, returning 28173 1726882792.32554: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-926c-8928-0000000008e7] 28173 1726882792.32560: sending task result for task 0e448fcc-3ce9-926c-8928-0000000008e7 28173 1726882792.32670: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000008e7 28173 1726882792.32673: WORKER PROCESS 
EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.008520",
    "end": "2024-09-20 21:39:52.279644",
    "rc": 0,
    "start": "2024-09-20 21:39:52.271124"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host
       valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
    link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff
    altname enX0
    inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0
       valid_lft 3072sec preferred_lft 3072sec
    inet6 fe80::104f:68ff:fe7a:deb1/64 scope link
       valid_lft forever preferred_lft forever
30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000
    link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff
    inet 192.0.2.72/31 scope global noprefixroute rpltstbr
       valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100
10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100
192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown
IP -6 ROUTE
::1 dev lo proto kernel metric 256 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1

28173 1726882792.32743: no more pending results, returning what we have
28173 1726882792.32747: results queue empty
28173 1726882792.32747: checking for any_errors_fatal
28173 1726882792.32749: done checking for any_errors_fatal
28173 1726882792.32749: checking for max_fail_percentage
28173 1726882792.32751: done checking for max_fail_percentage
28173 1726882792.32752: checking to see if all hosts have failed and the running result is not ok
28173 1726882792.32753: done checking to see if all hosts have failed
28173 1726882792.32754: getting the remaining hosts for this loop
28173 1726882792.32755: done getting the remaining hosts for this loop
28173 1726882792.32759: getting the next task for host managed_node2
28173 1726882792.32768: done getting next task for host managed_node2
28173 1726882792.32771: ^ task is: TASK: Verify DNS and network connectivity
28173 1726882792.32774: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 28173 1726882792.32778: getting variables 28173 1726882792.32779: in VariableManager get_vars() 28173 1726882792.32811: Calling all_inventory to load vars for managed_node2 28173 1726882792.32813: Calling groups_inventory to load vars for managed_node2 28173 1726882792.32816: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882792.32827: Calling all_plugins_play to load vars for managed_node2 28173 1726882792.32829: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882792.32831: Calling groups_plugins_play to load vars for managed_node2 28173 1726882792.34171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882792.35315: done with get_vars() 28173 1726882792.35332: done getting variables 28173 1726882792.35380: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:39:52 -0400 (0:00:00.385) 0:00:45.518 ****** 28173 1726882792.35402: entering _queue_task() for managed_node2/shell 28173 1726882792.35630: worker is 1 (out of 1 available) 28173 1726882792.35644: exiting _queue_task() for managed_node2/shell 28173 1726882792.35656: done queuing things up, now waiting for results queue to drain 28173 1726882792.35657: waiting for pending results... 28173 1726882792.35839: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 28173 1726882792.35917: in run() - task 0e448fcc-3ce9-926c-8928-0000000008e8 28173 1726882792.35927: variable 'ansible_search_path' from source: unknown 28173 1726882792.35931: variable 'ansible_search_path' from source: unknown 28173 1726882792.35960: calling self._execute() 28173 1726882792.36041: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882792.36045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882792.36054: variable 'omit' from source: magic vars 28173 1726882792.36329: variable 'ansible_distribution_major_version' from source: facts 28173 1726882792.36340: Evaluated conditional (ansible_distribution_major_version != '6'): True 28173 1726882792.36438: variable 'ansible_facts' from source: unknown 28173 1726882792.36926: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 28173 1726882792.36932: variable 'omit' from source: magic vars 28173 1726882792.36958: variable 'omit' from source: magic vars 28173 1726882792.36988: variable 'omit' from source: magic vars 28173 1726882792.37023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28173 1726882792.37048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28173 1726882792.37068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28173 1726882792.37082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882792.37094: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28173 1726882792.37117: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28173 1726882792.37121: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882792.37123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882792.37195: Set connection var ansible_pipelining to False 28173 1726882792.37199: Set connection var ansible_shell_type to sh 28173 1726882792.37206: Set connection var ansible_module_compression to ZIP_DEFLATED 28173 1726882792.37212: Set connection var ansible_timeout to 10 28173 1726882792.37216: Set connection var ansible_shell_executable to /bin/sh 28173 1726882792.37221: Set connection var ansible_connection to ssh 28173 1726882792.37240: variable 'ansible_shell_executable' from source: unknown 28173 1726882792.37242: variable 'ansible_connection' from source: unknown 28173 1726882792.37245: variable 'ansible_module_compression' from source: unknown 28173 1726882792.37247: variable 'ansible_shell_type' from source: unknown 28173 1726882792.37249: variable 'ansible_shell_executable' from source: unknown 28173 1726882792.37252: variable 'ansible_host' from source: host vars for 'managed_node2' 28173 1726882792.37254: variable 'ansible_pipelining' from source: unknown 28173 1726882792.37258: variable 'ansible_timeout' from source: unknown 28173 1726882792.37262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28173 1726882792.37365: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882792.37374: variable 'omit' from source: magic vars 28173 1726882792.37379: starting attempt loop 28173 1726882792.37382: running the handler 28173 1726882792.37391: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28173 1726882792.37408: _low_level_execute_command(): starting 28173 1726882792.37415: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28173 1726882792.37955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.37975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.37996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882792.38009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found <<< 28173 1726882792.38019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.38058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.38080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28173 1726882792.38091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.38201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.39839: stdout chunk (state=3): >>>/root <<< 28173 1726882792.39944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.40000: stderr chunk (state=3): >>><<< 28173 1726882792.40006: stdout chunk (state=3): >>><<< 28173 1726882792.40028: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.40038: _low_level_execute_command(): starting 28173 1726882792.40044: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383 `" && echo ansible-tmp-1726882792.4002647-30223-215583784104383="` echo /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383 `" ) && sleep 0' 28173 1726882792.40502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.40507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.40540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 28173 1726882792.40553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.40566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.40610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.40622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.40728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.42588: stdout chunk (state=3): >>>ansible-tmp-1726882792.4002647-30223-215583784104383=/root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383 <<< 28173 1726882792.42701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.42749: stderr chunk (state=3): >>><<< 28173 1726882792.42752: stdout chunk (state=3): >>><<< 28173 1726882792.42770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882792.4002647-30223-215583784104383=/root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.42798: variable 'ansible_module_compression' from source: unknown 28173 1726882792.42838: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2817385jvvq10/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28173 1726882792.42872: variable 'ansible_facts' from source: unknown 28173 1726882792.42925: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/AnsiballZ_command.py 28173 1726882792.43036: Sending initial data 28173 1726882792.43039: Sent initial data (156 bytes) 28173 1726882792.43719: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.43725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.43755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.43773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.43785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.43831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.43843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.43946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.45679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28173 1726882792.45805: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28173 1726882792.45898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2817385jvvq10/tmpepoozkkt /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/AnsiballZ_command.py <<< 28173 1726882792.45992: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28173 1726882792.47327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.47408: stderr chunk (state=3): >>><<< 28173 1726882792.47411: stdout chunk (state=3): >>><<< 28173 1726882792.47433: done transferring module to remote 28173 1726882792.47442: _low_level_execute_command(): starting 28173 1726882792.47447: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/ /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/AnsiballZ_command.py && sleep 0' 28173 1726882792.47898: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28173 1726882792.47903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.47933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.47945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.48000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.48012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.48114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.49938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.49948: stdout chunk (state=3): >>><<< 28173 1726882792.49974: stderr chunk (state=3): >>><<< 28173 1726882792.49995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.50005: _low_level_execute_command(): starting 28173 1726882792.50015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/AnsiballZ_command.py && sleep 0' 28173 1726882792.51036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.51039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.51084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.51087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.51143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.51146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.51270: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28173 1726882792.91516: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 10892 0 --:--:-- --:--:-- --:--:-- 11296\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:39:52.640295", "end": "2024-09-20 21:39:52.912941", "delta": "0:00:00.272646", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28173 1726882792.92753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 28173 1726882792.92804: stderr chunk (state=3): >>><<< 28173 1726882792.92807: stdout chunk (state=3): >>><<< 28173 1726882792.92825: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 10892 0 --:--:-- --:--:-- --:--:-- 11296\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:39:52.640295", "end": "2024-09-20 21:39:52.912941", "delta": "0:00:00.272646", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 28173 1726882792.92857: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28173 1726882792.92865: _low_level_execute_command(): starting 28173 1726882792.92874: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882792.4002647-30223-215583784104383/ > /dev/null 2>&1 && sleep 0' 28173 1726882792.93292: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.93296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28173 1726882792.93343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.93346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28173 1726882792.93349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28173 1726882792.93404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28173 1726882792.93408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28173 1726882792.93512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28173 1726882792.95413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28173 1726882792.95416: stderr chunk (state=3): >>><<< 28173 1726882792.95447: stdout chunk (state=3): >>><<< 28173 1726882792.95590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28173 1726882792.95594: handler run complete 28173 1726882792.95596: Evaluated conditional (False): False 28173 1726882792.95599: attempt loop complete, returning result 28173 1726882792.95601: _execute() done 28173 1726882792.95603: dumping result to json 28173 1726882792.95605: done dumping result, returning 28173 1726882792.95607: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-926c-8928-0000000008e8] 28173 1726882792.95609: sending task result for task 0e448fcc-3ce9-926c-8928-0000000008e8 ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.272646", "end": "2024-09-20 21:39:52.912941", "rc": 0, "start": "2024-09-20 21:39:52.640295" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 10892 0 --:--:-- --:--:-- --:--:-- 11296 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316 28173 1726882792.95756: no more pending results, returning what we have 28173 1726882792.95759: results queue empty 28173 1726882792.95760: checking for any_errors_fatal 28173 1726882792.95773: done checking for any_errors_fatal 28173 1726882792.95774: checking for max_fail_percentage 28173 1726882792.95776: done checking for max_fail_percentage 28173 1726882792.95777: checking to see if all hosts have failed 
and the running result is not ok 28173 1726882792.95778: done checking to see if all hosts have failed 28173 1726882792.95778: getting the remaining hosts for this loop 28173 1726882792.95780: done getting the remaining hosts for this loop 28173 1726882792.95785: getting the next task for host managed_node2 28173 1726882792.95792: done getting next task for host managed_node2 28173 1726882792.95795: ^ task is: TASK: meta (flush_handlers) 28173 1726882792.95801: done sending task result for task 0e448fcc-3ce9-926c-8928-0000000008e8 28173 1726882792.95807: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28173 1726882792.95813: WORKER PROCESS EXITING 28173 1726882792.95819: getting variables 28173 1726882792.95824: in VariableManager get_vars() 28173 1726882792.95856: Calling all_inventory to load vars for managed_node2 28173 1726882792.95859: Calling groups_inventory to load vars for managed_node2 28173 1726882792.95862: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882792.95878: Calling all_plugins_play to load vars for managed_node2 28173 1726882792.95881: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882792.95884: Calling groups_plugins_play to load vars for managed_node2 28173 1726882792.97778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882792.99643: done with get_vars() 28173 1726882792.99670: done getting variables 28173 1726882792.99750: in VariableManager get_vars() 28173 1726882792.99760: Calling all_inventory to load vars for managed_node2 28173 1726882792.99765: Calling groups_inventory to load vars for managed_node2 28173 1726882792.99767: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882792.99773: Calling all_plugins_play to load vars for managed_node2 28173 1726882792.99775: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882792.99778: Calling groups_plugins_play to load vars for managed_node2 28173 1726882793.02787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882793.07794: done with get_vars() 28173 1726882793.07837: done queuing things up, now waiting for results queue to drain 28173 1726882793.07840: results queue empty 28173 1726882793.07841: checking for any_errors_fatal 28173 1726882793.07846: done checking for any_errors_fatal 28173 1726882793.07846: checking for max_fail_percentage 28173 1726882793.07848: done checking for max_fail_percentage 28173 1726882793.07848: checking to see if all hosts have failed and the running result is not ok 28173 1726882793.07849: done checking to see if all hosts have failed 28173 1726882793.07850: getting the remaining hosts for this loop 28173 1726882793.07851: done getting the remaining hosts for this loop 28173 1726882793.07855: getting the next task for host managed_node2 28173 1726882793.07860: done getting next task for host managed_node2 28173 1726882793.07862: ^ task is: TASK: meta (flush_handlers) 28173 1726882793.07865: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28173 1726882793.07868: getting variables 28173 1726882793.07869: in VariableManager get_vars() 28173 1726882793.07881: Calling all_inventory to load vars for managed_node2 28173 1726882793.07884: Calling groups_inventory to load vars for managed_node2 28173 1726882793.07886: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882793.07892: Calling all_plugins_play to load vars for managed_node2 28173 1726882793.07895: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882793.07898: Calling groups_plugins_play to load vars for managed_node2 28173 1726882793.10008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882793.11808: done with get_vars() 28173 1726882793.11831: done getting variables 28173 1726882793.11887: in VariableManager get_vars() 28173 1726882793.11902: Calling all_inventory to load vars for managed_node2 28173 1726882793.11904: Calling groups_inventory to load vars for managed_node2 28173 1726882793.11906: Calling all_plugins_inventory to load vars for managed_node2 28173 1726882793.11912: Calling all_plugins_play to load vars for managed_node2 28173 1726882793.11914: Calling groups_plugins_inventory to load vars for managed_node2 28173 1726882793.11917: Calling groups_plugins_play to load vars for managed_node2 28173 1726882793.14533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28173 1726882793.17316: done with get_vars() 28173 1726882793.17345: done queuing things up, now waiting for results queue to drain 28173 1726882793.17347: results queue empty 28173 1726882793.17348: checking for any_errors_fatal 28173 1726882793.17349: done checking for any_errors_fatal 28173 1726882793.17350: checking for max_fail_percentage 28173 1726882793.17351: done checking for max_fail_percentage 28173 1726882793.17351: checking to see if all hosts have failed and the running result is not ok 28173 1726882793.17352: done checking to see if all hosts have failed 28173 1726882793.17353: getting the remaining hosts for this loop 28173 1726882793.17354: done getting the remaining hosts for this loop 28173 1726882793.17356: getting the next task for host managed_node2 28173 1726882793.17359: done getting next task for host managed_node2 28173 1726882793.17360: ^ task is: None 28173 1726882793.17361: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28173 1726882793.17362: done queuing things up, now waiting for results queue to drain 28173 1726882793.17365: results queue empty 28173 1726882793.17365: checking for any_errors_fatal 28173 1726882793.17366: done checking for any_errors_fatal 28173 1726882793.17367: checking for max_fail_percentage 28173 1726882793.17368: done checking for max_fail_percentage 28173 1726882793.17368: checking to see if all hosts have failed and the running result is not ok 28173 1726882793.17369: done checking to see if all hosts have failed 28173 1726882793.17370: getting the next task for host managed_node2 28173 1726882793.17372: done getting next task for host managed_node2 28173 1726882793.17373: ^ task is: None 28173 1726882793.17374: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=90   changed=6    unreachable=0    failed=0    skipped=91   rescued=0    ignored=1

Friday 20 September 2024  21:39:53 -0400 (0:00:00.820)       0:00:46.338 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.68s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.68s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.51s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.49s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.19s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Create veth interface ethtest0 ------------------------------------------ 1.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.94s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.92s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 0.82s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 0.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
28173 1726882793.17488: RUNNING CLEANUP
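
The repeated "auto-mux: Trying existing master" and "mux_client_request_session" lines in the log show that every low-level command in this section (the sftp put, the chmod, the module run, the temp-dir cleanup) rides one persistent OpenSSH master connection rather than opening a fresh SSH session each time. Ansible's ssh connection plugin enables this multiplexing by default; the sketch below shows roughly equivalent options set explicitly through ansible_ssh_common_args. The group_vars/all.yml placement and the specific ControlPersist/ControlPath values are assumptions for illustration, not settings recovered from this run.

# group_vars/all.yml -- hypothetical placement; values are illustrative, not
# taken from this run. These OpenSSH options reuse one master connection for
# every command sent to a host, which is what the "auto-mux" lines reflect.
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%h-%p-%r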
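The transfer sequence in the log (sftp put of AnsiballZ_command.py into a per-task temp directory, chmod u+x, execution with /usr/bin/python3.9, then rm -f -r of the directory) is the default module delivery path. With pipelining enabled, Ansible streams the module source over the already-open SSH session's stdin instead, dropping the file transfer and chmod round trips for most modules. A minimal sketch, assuming an inventory-level variable is an acceptable place to set it (become setups that enforce requiretty may need adjusting):

# group_vars/all.yml -- hypothetical placement, not part of the recorded run.
# When true, modules are piped to the remote interpreter over the existing
# SSH session, so the "sftp> put ... AnsiballZ_command.py" and "chmod u+x"
# steps seen in the log are skipped for most modules.
ansible_pipelining: true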
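The module arguments captured in the log are the full script behind TASK: Verify DNS and network connectivity (check_network_dns.yml:24): for each mirror host it requires a successful getent hosts lookup and a curl fetch, failing fast under set -euo pipefail. The sketch below reconstructs a task that would produce this invocation. The shell body mirrors the logged _raw_params; the surrounding task keywords, including changed_when: false (suggested by the displayed result reporting changed: false while the raw command module returned changed: true), are assumptions rather than the test suite's actual source.

# Reconstructed sketch -- the script body follows the logged module args;
# the task keywords around it are assumptions.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
  changed_when: false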
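After the task result is recorded, the strategy picks up two TASK: meta (flush_handlers) entries before the PLAY RECAP. These are the flush points Ansible inserts implicitly at play boundaries so that any notified handlers run before the play finishes; the log only shows them being scheduled and completing. A playbook can request the same flush explicitly, as in the hypothetical snippet below (illustrative only, not taken from these test playbooks):

# Illustrative only. Forces any handlers notified by earlier tasks to run
# at this point in the play instead of waiting for the end of the play.
- name: Flush notified handlers before continuing
  ansible.builtin.meta: flush_handlers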